Nov 25 15:35:12 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 25 15:35:12 crc restorecon[4694]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 15:35:12 crc restorecon[4694]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 25 15:35:12 crc restorecon[4694]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc 
restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:35:12 crc restorecon[4694]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:35:12 crc restorecon[4694]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc 
restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 
15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:35:12 crc restorecon[4694]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc 
restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 15:35:12 crc restorecon[4694]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:35:12 crc restorecon[4694]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:35:12 crc restorecon[4694]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:35:12 crc 
restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:35:12 crc restorecon[4694]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 15:35:12
crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:12 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 15:35:13 crc restorecon[4694]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 
25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 
crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc 
restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc 
restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc 
restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc 
restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc 
restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:35:13 crc restorecon[4694]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 15:35:13 crc restorecon[4694]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 25 15:35:14 crc kubenswrapper[4704]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 25 15:35:14 crc kubenswrapper[4704]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 25 15:35:14 crc kubenswrapper[4704]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 25 15:35:14 crc kubenswrapper[4704]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 25 15:35:14 crc kubenswrapper[4704]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 25 15:35:14 crc kubenswrapper[4704]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.199692 4704 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206751 4704 feature_gate.go:330] unrecognized feature gate: Example Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206808 4704 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206816 4704 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206822 4704 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206829 4704 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206836 4704 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206841 4704 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206846 4704 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206851 4704 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206856 4704 feature_gate.go:330] 
unrecognized feature gate: PinnedImages Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206861 4704 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206865 4704 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206869 4704 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206874 4704 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206878 4704 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206882 4704 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206886 4704 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206890 4704 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206895 4704 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206899 4704 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206904 4704 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206913 4704 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206917 4704 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206922 4704 feature_gate.go:330] 
unrecognized feature gate: MixedCPUsAllocation Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206926 4704 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206933 4704 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206940 4704 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206945 4704 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206950 4704 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206957 4704 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206963 4704 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206969 4704 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206977 4704 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206984 4704 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206990 4704 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.206995 4704 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207000 4704 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207004 4704 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207009 4704 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207013 4704 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207020 4704 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207026 4704 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207031 4704 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207036 4704 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207040 4704 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207045 4704 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207050 4704 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207054 4704 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207059 4704 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207063 4704 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207068 4704 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207072 4704 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207077 4704 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207081 4704 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207086 4704 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207091 4704 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207096 4704 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207103 4704 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207108 4704 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207113 4704 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207120 4704 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207124 4704 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207130 4704 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207135 4704 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207140 4704 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207144 4704 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207148 4704 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207153 4704 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207157 4704 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207161 4704 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.207166 4704 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208202 4704 flags.go:64] FLAG: --address="0.0.0.0"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208223 4704 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208235 4704 flags.go:64] FLAG: --anonymous-auth="true"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208241 4704 flags.go:64] FLAG: --application-metrics-count-limit="100"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208252 4704 flags.go:64] FLAG: --authentication-token-webhook="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208256 4704 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208263 4704 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208268 4704 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208273 4704 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208277 4704 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208281 4704 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208286 4704 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208290 4704 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208294 4704 flags.go:64] FLAG: --cgroup-root=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208298 4704 flags.go:64] FLAG: --cgroups-per-qos="true"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208302 4704 flags.go:64] FLAG: --client-ca-file=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208306 4704 flags.go:64] FLAG: --cloud-config=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208310 4704 flags.go:64] FLAG: --cloud-provider=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208315 4704 flags.go:64] FLAG: --cluster-dns="[]"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208320 4704 flags.go:64] FLAG: --cluster-domain=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208324 4704 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208328 4704 flags.go:64] FLAG: --config-dir=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208333 4704 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208338 4704 flags.go:64] FLAG: --container-log-max-files="5"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208343 4704 flags.go:64] FLAG: --container-log-max-size="10Mi"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208349 4704 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208355 4704 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208360 4704 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208366 4704 flags.go:64] FLAG: --contention-profiling="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208371 4704 flags.go:64] FLAG: --cpu-cfs-quota="true"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208376 4704 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208380 4704 flags.go:64] FLAG: --cpu-manager-policy="none"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208384 4704 flags.go:64] FLAG: --cpu-manager-policy-options=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208390 4704 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208394 4704 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208398 4704 flags.go:64] FLAG: --enable-debugging-handlers="true"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208403 4704 flags.go:64] FLAG: --enable-load-reader="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208407 4704 flags.go:64] FLAG: --enable-server="true"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208411 4704 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208417 4704 flags.go:64] FLAG: --event-burst="100"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208421 4704 flags.go:64] FLAG: --event-qps="50"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208426 4704 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208431 4704 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208436 4704 flags.go:64] FLAG: --eviction-hard=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208443 4704 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208449 4704 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208454 4704 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208460 4704 flags.go:64] FLAG: --eviction-soft=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208465 4704 flags.go:64] FLAG: --eviction-soft-grace-period=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208470 4704 flags.go:64] FLAG: --exit-on-lock-contention="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208475 4704 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208480 4704 flags.go:64] FLAG: --experimental-mounter-path=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208485 4704 flags.go:64] FLAG: --fail-cgroupv1="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208489 4704 flags.go:64] FLAG: --fail-swap-on="true"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208494 4704 flags.go:64] FLAG: --feature-gates=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208501 4704 flags.go:64] FLAG: --file-check-frequency="20s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208506 4704 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208513 4704 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208519 4704 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208526 4704 flags.go:64] FLAG: --healthz-port="10248"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208531 4704 flags.go:64] FLAG: --help="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208536 4704 flags.go:64] FLAG: --hostname-override=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208541 4704 flags.go:64] FLAG: --housekeeping-interval="10s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208545 4704 flags.go:64] FLAG: --http-check-frequency="20s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208550 4704 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208554 4704 flags.go:64] FLAG: --image-credential-provider-config=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208557 4704 flags.go:64] FLAG: --image-gc-high-threshold="85"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208562 4704 flags.go:64] FLAG: --image-gc-low-threshold="80"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208566 4704 flags.go:64] FLAG: --image-service-endpoint=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208570 4704 flags.go:64] FLAG: --kernel-memcg-notification="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208574 4704 flags.go:64] FLAG: --kube-api-burst="100"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208578 4704 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208582 4704 flags.go:64] FLAG: --kube-api-qps="50"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208586 4704 flags.go:64] FLAG: --kube-reserved=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208591 4704 flags.go:64] FLAG: --kube-reserved-cgroup=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208595 4704 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208599 4704 flags.go:64] FLAG: --kubelet-cgroups=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208604 4704 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208608 4704 flags.go:64] FLAG: --lock-file=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208612 4704 flags.go:64] FLAG: --log-cadvisor-usage="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208616 4704 flags.go:64] FLAG: --log-flush-frequency="5s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208621 4704 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208628 4704 flags.go:64] FLAG: --log-json-split-stream="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208632 4704 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208637 4704 flags.go:64] FLAG: --log-text-split-stream="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208642 4704 flags.go:64] FLAG: --logging-format="text"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208647 4704 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208653 4704 flags.go:64] FLAG: --make-iptables-util-chains="true"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208657 4704 flags.go:64] FLAG: --manifest-url=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208662 4704 flags.go:64] FLAG: --manifest-url-header=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208669 4704 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208674 4704 flags.go:64] FLAG: --max-open-files="1000000"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208679 4704 flags.go:64] FLAG: --max-pods="110"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208683 4704 flags.go:64] FLAG: --maximum-dead-containers="-1"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208690 4704 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208694 4704 flags.go:64] FLAG: --memory-manager-policy="None"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208699 4704 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208703 4704 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208707 4704 flags.go:64] FLAG: --node-ip="192.168.126.11"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208711 4704 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208722 4704 flags.go:64] FLAG: --node-status-max-images="50"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208727 4704 flags.go:64] FLAG: --node-status-update-frequency="10s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208732 4704 flags.go:64] FLAG: --oom-score-adj="-999"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208737 4704 flags.go:64] FLAG: --pod-cidr=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208742 4704 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208751 4704 flags.go:64] FLAG: --pod-manifest-path=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208755 4704 flags.go:64] FLAG: --pod-max-pids="-1"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208759 4704 flags.go:64] FLAG: --pods-per-core="0"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208763 4704 flags.go:64] FLAG: --port="10250"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208767 4704 flags.go:64] FLAG: --protect-kernel-defaults="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208771 4704 flags.go:64] FLAG: --provider-id=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208775 4704 flags.go:64] FLAG: --qos-reserved=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208780 4704 flags.go:64] FLAG: --read-only-port="10255"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208808 4704 flags.go:64] FLAG: --register-node="true"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208814 4704 flags.go:64] FLAG: --register-schedulable="true"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208818 4704 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208826 4704 flags.go:64] FLAG: --registry-burst="10"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208830 4704 flags.go:64] FLAG: --registry-qps="5"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208835 4704 flags.go:64] FLAG: --reserved-cpus=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208839 4704 flags.go:64] FLAG: --reserved-memory=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208845 4704 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208849 4704 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208854 4704 flags.go:64] FLAG: --rotate-certificates="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208859 4704 flags.go:64] FLAG: --rotate-server-certificates="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208863 4704 flags.go:64] FLAG: --runonce="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208867 4704 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208872 4704 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208876 4704 flags.go:64] FLAG: --seccomp-default="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208880 4704 flags.go:64] FLAG: --serialize-image-pulls="true"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208885 4704 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208890 4704 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208894 4704 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208899 4704 flags.go:64] FLAG: --storage-driver-password="root"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208903 4704 flags.go:64] FLAG: --storage-driver-secure="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208907 4704 flags.go:64] FLAG: --storage-driver-table="stats"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208911 4704 flags.go:64] FLAG: --storage-driver-user="root"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208915 4704 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208920 4704 flags.go:64] FLAG: --sync-frequency="1m0s"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208924 4704 flags.go:64] FLAG: --system-cgroups=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208929 4704 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208937 4704 flags.go:64] FLAG: --system-reserved-cgroup=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208942 4704 flags.go:64] FLAG: --tls-cert-file=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208947 4704 flags.go:64] FLAG: --tls-cipher-suites="[]"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208954 4704 flags.go:64] FLAG: --tls-min-version=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208959 4704 flags.go:64] FLAG: --tls-private-key-file=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208963 4704 flags.go:64] FLAG: --topology-manager-policy="none"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208968 4704 flags.go:64] FLAG: --topology-manager-policy-options=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208972 4704 flags.go:64] FLAG: --topology-manager-scope="container"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208976 4704 flags.go:64] FLAG: --v="2"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208983 4704 flags.go:64] FLAG: --version="false"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208989 4704 flags.go:64] FLAG: --vmodule=""
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208995 4704 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.208999 4704 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209102 4704 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209110 4704 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209114 4704 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209118 4704 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209122 4704 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209125 4704 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209129 4704 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209133 4704 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209136 4704 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209140 4704 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209144 4704 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209147 4704 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209150 4704 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209156 4704 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209160 4704 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209165 4704 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209168 4704 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209173 4704 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209178 4704 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209183 4704 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209187 4704 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209191 4704 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209195 4704 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209198 4704 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209202 4704 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209206 4704 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209210 4704 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209215 4704 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209219 4704 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209224 4704 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209228 4704 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209232 4704 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209237 4704 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209242 4704 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209248 4704 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209252 4704 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209256 4704 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209260 4704 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209264 4704 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209267 4704 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209270 4704 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209274 4704 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209279 4704 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209283 4704 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209287 4704 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209291 4704 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209295 4704 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209300 4704 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209304 4704 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209309 4704 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209314 4704 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209319 4704 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209323 4704 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209327 4704 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209332 4704 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209336 4704 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209344 4704 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209350 4704 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209355 4704 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209359 4704 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209363 4704 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209367 4704 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209371 4704 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209375 4704 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209378 4704 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209382 4704 feature_gate.go:330] unrecognized feature gate: Example
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209387 4704 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209391 4704 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209396 4704 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209400 4704 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.209404 4704 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.209418 4704 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.223252 4704 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.223284 4704 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223349 4704 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223357 4704 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223362 4704 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223367 4704 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223372 4704 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223377 4704 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223381 4704 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223385 4704 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223388 4704 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223392 4704 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223396 4704 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223400 4704 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223403 4704 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223408 4704 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223412 4704 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223415 4704 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223418 4704 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223422 4704 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223426 4704 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223429 4704 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223433 4704 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223436 4704 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223440 4704 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223443 4704 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223447 4704 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223450 4704 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223454 4704 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223458 4704 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223461 4704 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223464 4704 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223499 4704 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223504 4704 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223509 4704 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223514 4704 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223519 4704 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223524 4704 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223528 4704 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223532 4704 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223537 4704 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223542 4704 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223547 4704 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223551 4704 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223554 4704 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223558 4704 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223562 4704 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223565 4704 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223569 4704 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223573 4704 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223576 4704 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223580 4704 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223584 4704 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223587 4704 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223591 4704 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223594 4704 feature_gate.go:330] unrecognized feature gate: Example
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223599 4704 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223604 4704 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223608 4704 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223611 4704 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223615 4704 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223619 4704 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223622 4704 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223627 4704 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223632 4704 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223637 4704 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223641 4704 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223645 4704 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223650 4704 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223654 4704 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223658 4704 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223662 4704 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223667 4704 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.223674 4704 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223918 4704 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223932 4704 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223937 4704 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223943 4704 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223948 4704 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223953 4704 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223958 4704 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223963 4704 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223968 4704 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223972 4704 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223977 4704 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223982 4704 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223986 4704 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223991 4704 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.223995 4704 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224000 4704 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224006 4704 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224013 4704 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224017 4704 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224022 4704 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224027 4704 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224032 4704 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224038 4704 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224043 4704 feature_gate.go:330] unrecognized feature gate: Example
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224048 4704 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224053 4704 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224058 4704 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224063 4704 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224067 4704 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224073 4704 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224078 4704 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224083 4704 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224088 4704 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224093 4704 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224098 4704 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224103 4704 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224108 4704 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224113 4704 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224119 4704 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224125 4704 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224130 4704 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224135 4704 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224140 4704 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224145 4704 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224149 4704 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224154 4704 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224159 4704 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224163 4704 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224167 4704 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224171 4704 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224176 4704 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224180 4704 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224186 4704 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224193 4704 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224197 4704 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224203 4704 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224207 4704 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224212 4704 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224217 4704 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224221 4704 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224226 4704 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224230 4704 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224235 4704 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224239 4704 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224243 4704 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224248 4704 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224254 4704 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224260 4704 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224264 4704 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224269 4704 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.224274 4704 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.224282 4704 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.225702 4704 server.go:940] "Client rotation is on, will bootstrap in background"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.229913 4704 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.230005 4704 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.231664 4704 server.go:997] "Starting client certificate rotation"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.231694 4704 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.231915 4704 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-10 01:35:19.238875372 +0000 UTC
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.232004 4704 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.256588 4704 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.258620 4704 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 25 15:35:14 crc kubenswrapper[4704]: E1125 15:35:14.259559 4704 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.269745 4704 log.go:25] "Validated CRI v1 runtime API"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.302708 4704 log.go:25] "Validated CRI v1 image API"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.304828 4704 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.310405 4704 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-25-15-31-00-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.310431 4704 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.331318 4704 manager.go:217] Machine: {Timestamp:2025-11-25 15:35:14.326502998 +0000 UTC m=+0.594776809 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7a7c0a64-f7eb-4637-84e2-93500c0e5ef0 BootID:fc66017c-37c1-4b18-ad41-23da03d4564b Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:94:ba:9e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:94:ba:9e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:22:34:75 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:98:18:1f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:85:46:72 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:69:98:be Speed:-1 Mtu:1496} {Name:eth10 MacAddress:42:18:33:b1:27:fb Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:1a:f9:2d:38:56:f5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.331690 4704 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.332019 4704 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.336239 4704 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.336550 4704 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.336610 4704 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.336943 4704 topology_manager.go:138] "Creating topology manager with none policy"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.336960 4704 container_manager_linux.go:303] "Creating device plugin manager"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.337568 4704 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.337612 4704 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.337977 4704 state_mem.go:36] "Initialized new in-memory state store"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.338097 4704 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.340825 4704 kubelet.go:418] "Attempting to sync node with API server"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.340872 4704 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.340907 4704 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.340923 4704 kubelet.go:324] "Adding apiserver pod source"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.340939 4704 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.345038 4704 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.347996 4704 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.349133 4704 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Nov 25 15:35:14 crc kubenswrapper[4704]: E1125 15:35:14.349241 4704 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.349131 4704 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Nov 25 15:35:14 crc kubenswrapper[4704]: E1125 15:35:14.349297 4704 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.350480 4704 kubelet.go:854] "Not starting ClusterTrustBundle informer because 
we are in static kubelet mode" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.353370 4704 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.353420 4704 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.353428 4704 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.353438 4704 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.353449 4704 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.353456 4704 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.353463 4704 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.353477 4704 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.353487 4704 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.353495 4704 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.353529 4704 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.353536 4704 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.353563 4704 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.354143 4704 server.go:1280] "Started kubelet" Nov 25 
15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.355736 4704 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.356556 4704 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Nov 25 15:35:14 crc systemd[1]: Started Kubernetes Kubelet. Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.356878 4704 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.356902 4704 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.357435 4704 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.356150 4704 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.358196 4704 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 17:59:14.063846936 +0000 UTC Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.359069 4704 volume_manager.go:287] "The desired_state_of_world populator starts" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.359090 4704 volume_manager.go:289] "Starting Kubelet Volume Manager" Nov 25 15:35:14 crc kubenswrapper[4704]: E1125 15:35:14.358380 4704 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.359259 4704 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Nov 25 15:35:14 crc kubenswrapper[4704]: 
W1125 15:35:14.359930 4704 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Nov 25 15:35:14 crc kubenswrapper[4704]: E1125 15:35:14.360021 4704 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.360710 4704 server.go:460] "Adding debug handlers to kubelet server" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.362115 4704 factory.go:55] Registering systemd factory Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.362190 4704 factory.go:221] Registration of the systemd container factory successfully Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.364352 4704 factory.go:153] Registering CRI-O factory Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.364389 4704 factory.go:221] Registration of the crio container factory successfully Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.364507 4704 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.364532 4704 factory.go:103] Registering Raw factory Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.364550 4704 manager.go:1196] Started watching for new ooms in manager Nov 25 15:35:14 crc kubenswrapper[4704]: E1125 15:35:14.364605 4704 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="200ms" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.365908 4704 manager.go:319] Starting recovery of all containers Nov 25 15:35:14 crc kubenswrapper[4704]: E1125 15:35:14.369415 4704 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187b49e211b39fad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-25 15:35:14.354102189 +0000 UTC m=+0.622375990,LastTimestamp:2025-11-25 15:35:14.354102189 +0000 UTC m=+0.622375990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.373915 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374045 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374078 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374131 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374170 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374201 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374226 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374247 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374274 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374296 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374317 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374339 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374362 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374390 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374411 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374432 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374469 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374492 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374512 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374533 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374553 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374576 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374598 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374619 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374644 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374666 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374695 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" 
seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374720 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374858 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374901 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374929 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.374950 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375238 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 25 15:35:14 crc 
kubenswrapper[4704]: I1125 15:35:14.375266 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375288 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375309 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375330 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375351 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375372 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375397 4704 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375419 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375442 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375467 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375492 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375516 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375540 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375565 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375587 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375615 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375640 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375662 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375683 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375716 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375740 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375765 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375840 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375893 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375928 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" 
seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.375961 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376231 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376264 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376298 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376331 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376362 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376404 4704 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376436 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376468 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376505 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376548 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376603 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376645 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376678 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376708 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376736 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376768 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376855 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376899 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.376939 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.377013 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.377036 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.377094 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.377144 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.377177 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.377205 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.377233 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.377261 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.377292 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.377339 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381154 4704 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381222 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381249 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381275 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381303 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381324 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381349 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381371 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381401 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381424 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381446 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381470 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381493 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381521 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381565 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381588 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381612 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381652 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381682 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381710 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381735 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381759 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381784 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381836 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381864 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381889 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381912 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381934 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381955 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381975 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.381997 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382025 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382049 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382074 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382098 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382121 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382139 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382156 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382173 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382190 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382209 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382226 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382248 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382269 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382289 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382307 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382324 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382347 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382368 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382391 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382409 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382426 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382447 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382471 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382542 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" 
seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382567 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382594 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382615 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382636 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382656 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382678 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 
15:35:14.382700 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382720 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382751 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382774 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382820 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382842 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382865 4704 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382886 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382911 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382934 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382954 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382976 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.382998 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383019 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383041 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383063 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383084 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383102 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383153 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" 
seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383169 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383189 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383204 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383219 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383237 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383254 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 
15:35:14.383272 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383288 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383306 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383322 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383336 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383352 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383369 4704 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383385 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383403 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383421 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383436 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383454 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.383990 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384012 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384030 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384046 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384065 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384082 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384121 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384139 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384156 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384173 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384190 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384206 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384222 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384239 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384256 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384274 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384290 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384307 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384324 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384341 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384357 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384372 4704 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384390 4704 reconstruct.go:97] "Volume reconstruction finished" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.384402 4704 reconciler.go:26] "Reconciler: start to sync state" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.385453 4704 manager.go:324] Recovery completed Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.396542 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.398888 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.398930 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 
15:35:14.398944 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.401281 4704 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.401317 4704 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.401433 4704 state_mem.go:36] "Initialized new in-memory state store" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.411823 4704 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.414315 4704 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.414540 4704 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.414741 4704 policy_none.go:49] "None policy: Start" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.414968 4704 kubelet.go:2335] "Starting kubelet main sync loop" Nov 25 15:35:14 crc kubenswrapper[4704]: E1125 15:35:14.415213 4704 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.416404 4704 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.416444 4704 state_mem.go:35] "Initializing new in-memory state store" Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.420493 4704 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Nov 25 15:35:14 crc kubenswrapper[4704]: 
E1125 15:35:14.420843 4704 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:35:14 crc kubenswrapper[4704]: E1125 15:35:14.459492 4704 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.478242 4704 manager.go:334] "Starting Device Plugin manager" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.478346 4704 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.478366 4704 server.go:79] "Starting device plugin registration server" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.478906 4704 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.478935 4704 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.479911 4704 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.480121 4704 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.480197 4704 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 25 15:35:14 crc kubenswrapper[4704]: E1125 15:35:14.488174 4704 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.516359 4704 kubelet.go:2421] 
"SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.516518 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.518102 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.518165 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.518176 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.518359 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.518686 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.518759 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.519476 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.519496 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.519505 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.519581 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.519655 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.519688 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.519780 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.519823 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.519836 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.520384 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.520403 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.520421 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.520491 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.520864 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.520876 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.520922 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.520890 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.520941 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.521155 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.521174 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.521182 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.521295 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.521476 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.521487 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.521499 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.521516 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.521527 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.521901 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.521926 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.521941 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.522256 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.522284 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.522300 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.522308 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.522315 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.523162 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.523184 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.523193 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:14 crc kubenswrapper[4704]: E1125 15:35:14.566060 4704 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="400ms" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.579663 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.581209 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 
15:35:14.581263 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.581277 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.581313 4704 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:35:14 crc kubenswrapper[4704]: E1125 15:35:14.582054 4704 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.586420 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.586476 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.586510 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.586533 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.586557 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.586577 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.586600 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.586619 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.586641 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.586667 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.586685 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.586703 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.586726 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.586746 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.586768 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.687828 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.687884 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.687909 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.687925 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.687940 4704 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.687959 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.687975 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.687991 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688007 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688024 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688043 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688061 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688078 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688094 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688110 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 
15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688211 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688261 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688285 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688316 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688258 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688329 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688375 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688228 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688406 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688406 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688335 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688426 4704 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688428 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688443 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.688614 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.782220 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.783820 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.783861 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.783876 4704 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.783900 4704 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:35:14 crc kubenswrapper[4704]: E1125 15:35:14.784514 4704 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.855431 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.874357 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.880145 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.901298 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: I1125 15:35:14.909738 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.910011 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-3cdab34970f59e29eac816cdf2d6d45acf7a65d74e3186c8d4c4a73d5068de34 WatchSource:0}: Error finding container 3cdab34970f59e29eac816cdf2d6d45acf7a65d74e3186c8d4c4a73d5068de34: Status 404 returned error can't find the container with id 3cdab34970f59e29eac816cdf2d6d45acf7a65d74e3186c8d4c4a73d5068de34 Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.910753 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-97bf641ca2b02edc671ebf9edd5c7000ce342843ab4c559a29f4b37937321e66 WatchSource:0}: Error finding container 97bf641ca2b02edc671ebf9edd5c7000ce342843ab4c559a29f4b37937321e66: Status 404 returned error can't find the container with id 97bf641ca2b02edc671ebf9edd5c7000ce342843ab4c559a29f4b37937321e66 Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.926657 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-8f4d784de101ddbb15173dd045d416341c1fc93c5881f40164b67cc67eeddccc WatchSource:0}: Error finding container 8f4d784de101ddbb15173dd045d416341c1fc93c5881f40164b67cc67eeddccc: Status 404 returned error can't find the container with id 8f4d784de101ddbb15173dd045d416341c1fc93c5881f40164b67cc67eeddccc Nov 25 15:35:14 crc kubenswrapper[4704]: W1125 15:35:14.928237 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-1d512d1be2a162df37ebd743f8679bb9e87599dfa1493f8f964aac9a75aef7bf 
WatchSource:0}: Error finding container 1d512d1be2a162df37ebd743f8679bb9e87599dfa1493f8f964aac9a75aef7bf: Status 404 returned error can't find the container with id 1d512d1be2a162df37ebd743f8679bb9e87599dfa1493f8f964aac9a75aef7bf Nov 25 15:35:14 crc kubenswrapper[4704]: E1125 15:35:14.967403 4704 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="800ms" Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.185690 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.187402 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.187453 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.187471 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.187499 4704 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:35:15 crc kubenswrapper[4704]: E1125 15:35:15.187961 4704 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Nov 25 15:35:15 crc kubenswrapper[4704]: W1125 15:35:15.188573 4704 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Nov 25 15:35:15 crc 
kubenswrapper[4704]: E1125 15:35:15.188643 4704 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.357725 4704 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.358739 4704 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 07:18:10.255248468 +0000 UTC Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.358821 4704 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1023h42m54.896429511s for next certificate rotation Nov 25 15:35:15 crc kubenswrapper[4704]: W1125 15:35:15.405524 4704 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Nov 25 15:35:15 crc kubenswrapper[4704]: E1125 15:35:15.405643 4704 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.420603 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1d512d1be2a162df37ebd743f8679bb9e87599dfa1493f8f964aac9a75aef7bf"} Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.421866 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8f4d784de101ddbb15173dd045d416341c1fc93c5881f40164b67cc67eeddccc"} Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.423733 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"dfc81f2d05f1817713002942ea6f5ecf1730181bc00546c26db266d64e786e0e"} Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.424828 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"97bf641ca2b02edc671ebf9edd5c7000ce342843ab4c559a29f4b37937321e66"} Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.426722 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3cdab34970f59e29eac816cdf2d6d45acf7a65d74e3186c8d4c4a73d5068de34"} Nov 25 15:35:15 crc kubenswrapper[4704]: W1125 15:35:15.471142 4704 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Nov 25 15:35:15 crc kubenswrapper[4704]: E1125 15:35:15.471270 4704 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:35:15 crc kubenswrapper[4704]: E1125 15:35:15.768563 4704 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="1.6s" Nov 25 15:35:15 crc kubenswrapper[4704]: W1125 15:35:15.829170 4704 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Nov 25 15:35:15 crc kubenswrapper[4704]: E1125 15:35:15.829249 4704 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.989094 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.990632 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.990659 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:15 crc kubenswrapper[4704]: I1125 15:35:15.990680 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:15 crc 
kubenswrapper[4704]: I1125 15:35:15.990702 4704 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:35:15 crc kubenswrapper[4704]: E1125 15:35:15.991200 4704 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.317874 4704 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Nov 25 15:35:16 crc kubenswrapper[4704]: E1125 15:35:16.318951 4704 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.358026 4704 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.431980 4704 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28" exitCode=0 Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.432086 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28"} Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.432192 4704 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.433169 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.433201 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.433210 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.433589 4704 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6" exitCode=0 Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.433660 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6"} Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.433890 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.434651 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.435215 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.435242 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.435253 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 
15:35:16.435384 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.435434 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.435445 4704 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5dcaac718777a480f97457edf8afd0a55e9199d31cf857614486e92b2897541e" exitCode=0 Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.435496 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.435453 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.435488 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5dcaac718777a480f97457edf8afd0a55e9199d31cf857614486e92b2897541e"} Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.437319 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.437378 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.437410 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.440396 4704 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74" exitCode=0 Nov 25 15:35:16 
crc kubenswrapper[4704]: I1125 15:35:16.440544 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.440572 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74"} Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.441560 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.441594 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.441605 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.444357 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6"} Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.444402 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02"} Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.444420 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c"} Nov 25 
15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.444434 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.444436 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909"} Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.445741 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.445763 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:16 crc kubenswrapper[4704]: I1125 15:35:16.445773 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.357757 4704 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Nov 25 15:35:17 crc kubenswrapper[4704]: E1125 15:35:17.370298 4704 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="3.2s" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.452076 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd"} Nov 25 15:35:17 crc kubenswrapper[4704]: 
I1125 15:35:17.452138 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b"} Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.452155 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e"} Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.452166 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120"} Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.454717 4704 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec" exitCode=0 Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.454857 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec"} Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.454922 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.456280 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.456360 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:17 crc 
kubenswrapper[4704]: I1125 15:35:17.456387 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.457762 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1ca92917da8a2f82963a21de252aca3b6ca15646ff0a30a07dfc3a0c25682691"} Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.457828 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.458670 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.458703 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.458719 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.461311 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.461428 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ea0b95608507532a76441cdae944fe025fdaf833fd16e62ca0851043de8bb308"} Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.461476 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0a86e85d42ee299ff07f19e664e89971cb63b7cc1edd398c1bedf638314ee482"} Nov 25 15:35:17 crc 
kubenswrapper[4704]: I1125 15:35:17.461493 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bf8b8c4cc5291456d1da07123c3cb28928c03f31896d7ecc39be043bf8b8d9ef"} Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.461504 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.462372 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.462411 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.462425 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.463269 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.463291 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.463307 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.591718 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.593432 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.593494 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.593509 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:17 crc kubenswrapper[4704]: I1125 15:35:17.593548 4704 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:35:17 crc kubenswrapper[4704]: E1125 15:35:17.594288 4704 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Nov 25 15:35:17 crc kubenswrapper[4704]: W1125 15:35:17.739918 4704 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Nov 25 15:35:17 crc kubenswrapper[4704]: E1125 15:35:17.740003 4704 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:35:17 crc kubenswrapper[4704]: W1125 15:35:17.794808 4704 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Nov 25 15:35:17 crc kubenswrapper[4704]: E1125 15:35:17.794925 4704 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial 
tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:35:18 crc kubenswrapper[4704]: W1125 15:35:18.012330 4704 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Nov 25 15:35:18 crc kubenswrapper[4704]: E1125 15:35:18.012471 4704 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:35:18 crc kubenswrapper[4704]: E1125 15:35:18.051343 4704 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187b49e211b39fad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-25 15:35:14.354102189 +0000 UTC m=+0.622375990,LastTimestamp:2025-11-25 15:35:14.354102189 +0000 UTC m=+0.622375990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 25 15:35:18 crc kubenswrapper[4704]: W1125 15:35:18.055859 4704 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Nov 
25 15:35:18 crc kubenswrapper[4704]: E1125 15:35:18.055926 4704 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.466369 4704 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb" exitCode=0 Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.466473 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb"} Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.466509 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.467472 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.467510 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.467519 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.470278 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091"} Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 
15:35:18.470318 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.470381 4704 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.470429 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.470429 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.471359 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.471382 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.471392 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.471618 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.471647 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.471660 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.471875 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:18 crc kubenswrapper[4704]: I1125 15:35:18.471949 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:18 crc 
kubenswrapper[4704]: I1125 15:35:18.471965 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.342015 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.477726 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede"} Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.477833 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.477841 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707"} Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.477925 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.477969 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b"} Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.478012 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a"} Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.478025 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384"} Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.478048 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.477847 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.479008 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.479036 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.479009 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.479047 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.479053 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.479078 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.479089 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.479059 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.479128 4704 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:19 crc kubenswrapper[4704]: I1125 15:35:19.848705 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 25 15:35:20 crc kubenswrapper[4704]: I1125 15:35:20.393660 4704 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Nov 25 15:35:20 crc kubenswrapper[4704]: I1125 15:35:20.480592 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:20 crc kubenswrapper[4704]: I1125 15:35:20.480597 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:20 crc kubenswrapper[4704]: I1125 15:35:20.481978 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:20 crc kubenswrapper[4704]: I1125 15:35:20.482038 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:20 crc kubenswrapper[4704]: I1125 15:35:20.482047 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:20 crc kubenswrapper[4704]: I1125 15:35:20.482141 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:20 crc kubenswrapper[4704]: I1125 15:35:20.482187 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:20 crc kubenswrapper[4704]: I1125 15:35:20.482202 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:20 crc kubenswrapper[4704]: I1125 15:35:20.794431 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:20 crc kubenswrapper[4704]: I1125 15:35:20.796657 4704 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:20 crc kubenswrapper[4704]: I1125 15:35:20.796723 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:20 crc kubenswrapper[4704]: I1125 15:35:20.796739 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:20 crc kubenswrapper[4704]: I1125 15:35:20.796779 4704 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:35:21 crc kubenswrapper[4704]: I1125 15:35:21.161563 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:21 crc kubenswrapper[4704]: I1125 15:35:21.161828 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:21 crc kubenswrapper[4704]: I1125 15:35:21.163347 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:21 crc kubenswrapper[4704]: I1125 15:35:21.163417 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:21 crc kubenswrapper[4704]: I1125 15:35:21.163437 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:21 crc kubenswrapper[4704]: I1125 15:35:21.169033 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:21 crc kubenswrapper[4704]: I1125 15:35:21.483508 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:21 crc kubenswrapper[4704]: I1125 15:35:21.483510 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Nov 25 15:35:21 crc kubenswrapper[4704]: I1125 15:35:21.484696 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:21 crc kubenswrapper[4704]: I1125 15:35:21.484740 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:21 crc kubenswrapper[4704]: I1125 15:35:21.484760 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:21 crc kubenswrapper[4704]: I1125 15:35:21.484883 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:21 crc kubenswrapper[4704]: I1125 15:35:21.484938 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:21 crc kubenswrapper[4704]: I1125 15:35:21.484959 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:21 crc kubenswrapper[4704]: I1125 15:35:21.989315 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:22 crc kubenswrapper[4704]: I1125 15:35:22.456308 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:22 crc kubenswrapper[4704]: I1125 15:35:22.456531 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:22 crc kubenswrapper[4704]: I1125 15:35:22.458031 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:22 crc kubenswrapper[4704]: I1125 15:35:22.458068 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:22 crc kubenswrapper[4704]: I1125 
15:35:22.458077 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:22 crc kubenswrapper[4704]: I1125 15:35:22.485717 4704 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 15:35:22 crc kubenswrapper[4704]: I1125 15:35:22.485769 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:22 crc kubenswrapper[4704]: I1125 15:35:22.486706 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:22 crc kubenswrapper[4704]: I1125 15:35:22.486734 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:22 crc kubenswrapper[4704]: I1125 15:35:22.486743 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:22 crc kubenswrapper[4704]: I1125 15:35:22.837053 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:22 crc kubenswrapper[4704]: I1125 15:35:22.837233 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:22 crc kubenswrapper[4704]: I1125 15:35:22.838434 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:22 crc kubenswrapper[4704]: I1125 15:35:22.838508 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:22 crc kubenswrapper[4704]: I1125 15:35:22.838522 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:23 crc kubenswrapper[4704]: I1125 15:35:23.944985 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:23 crc kubenswrapper[4704]: I1125 15:35:23.945240 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:23 crc kubenswrapper[4704]: I1125 15:35:23.947131 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:23 crc kubenswrapper[4704]: I1125 15:35:23.947238 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:23 crc kubenswrapper[4704]: I1125 15:35:23.947354 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:24 crc kubenswrapper[4704]: I1125 15:35:24.285420 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:24 crc kubenswrapper[4704]: E1125 15:35:24.488352 4704 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 25 15:35:24 crc kubenswrapper[4704]: I1125 15:35:24.491820 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:24 crc kubenswrapper[4704]: I1125 15:35:24.492705 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:24 crc kubenswrapper[4704]: I1125 15:35:24.492743 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:24 crc kubenswrapper[4704]: I1125 15:35:24.492756 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:24 crc kubenswrapper[4704]: I1125 15:35:24.989582 4704 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 25 15:35:24 crc kubenswrapper[4704]: I1125 15:35:24.989717 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 15:35:26 crc kubenswrapper[4704]: I1125 15:35:26.673378 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 25 15:35:26 crc kubenswrapper[4704]: I1125 15:35:26.673945 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:26 crc kubenswrapper[4704]: I1125 15:35:26.675012 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:26 crc kubenswrapper[4704]: I1125 15:35:26.675048 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:26 crc kubenswrapper[4704]: I1125 15:35:26.675062 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:28 crc kubenswrapper[4704]: I1125 15:35:28.358121 4704 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Nov 25 15:35:28 crc kubenswrapper[4704]: I1125 15:35:28.447238 4704 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP 
probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 25 15:35:28 crc kubenswrapper[4704]: I1125 15:35:28.447318 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 25 15:35:28 crc kubenswrapper[4704]: I1125 15:35:28.454666 4704 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 25 15:35:28 crc kubenswrapper[4704]: I1125 15:35:28.454752 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 25 15:35:28 crc kubenswrapper[4704]: I1125 15:35:28.505160 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 25 15:35:28 crc kubenswrapper[4704]: I1125 15:35:28.507267 4704 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091" exitCode=255 Nov 25 15:35:28 crc kubenswrapper[4704]: I1125 15:35:28.507326 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091"} Nov 25 15:35:28 crc kubenswrapper[4704]: I1125 15:35:28.507504 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:28 crc kubenswrapper[4704]: I1125 15:35:28.508760 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:28 crc kubenswrapper[4704]: I1125 15:35:28.509181 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:28 crc kubenswrapper[4704]: I1125 15:35:28.509289 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:28 crc kubenswrapper[4704]: I1125 15:35:28.510114 4704 scope.go:117] "RemoveContainer" containerID="0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091" Nov 25 15:35:29 crc kubenswrapper[4704]: I1125 15:35:29.512382 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 25 15:35:29 crc kubenswrapper[4704]: I1125 15:35:29.515241 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743"} Nov 25 15:35:29 crc kubenswrapper[4704]: I1125 15:35:29.515432 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:29 crc kubenswrapper[4704]: I1125 15:35:29.516286 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:29 crc kubenswrapper[4704]: I1125 15:35:29.516318 4704 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:29 crc kubenswrapper[4704]: I1125 15:35:29.516327 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:32 crc kubenswrapper[4704]: I1125 15:35:32.841703 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:32 crc kubenswrapper[4704]: I1125 15:35:32.841934 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:32 crc kubenswrapper[4704]: I1125 15:35:32.842063 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:32 crc kubenswrapper[4704]: I1125 15:35:32.843069 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:32 crc kubenswrapper[4704]: I1125 15:35:32.843099 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:32 crc kubenswrapper[4704]: I1125 15:35:32.843107 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:32 crc kubenswrapper[4704]: I1125 15:35:32.845474 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:33 crc kubenswrapper[4704]: E1125 15:35:33.430728 4704 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Nov 25 15:35:33 crc kubenswrapper[4704]: I1125 15:35:33.432607 4704 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 25 15:35:33 crc kubenswrapper[4704]: 
I1125 15:35:33.433557 4704 trace.go:236] Trace[1863844314]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 15:35:22.757) (total time: 10675ms): Nov 25 15:35:33 crc kubenswrapper[4704]: Trace[1863844314]: ---"Objects listed" error: 10675ms (15:35:33.433) Nov 25 15:35:33 crc kubenswrapper[4704]: Trace[1863844314]: [10.675593338s] [10.675593338s] END Nov 25 15:35:33 crc kubenswrapper[4704]: I1125 15:35:33.433596 4704 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 25 15:35:33 crc kubenswrapper[4704]: I1125 15:35:33.434017 4704 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 25 15:35:33 crc kubenswrapper[4704]: I1125 15:35:33.434742 4704 trace.go:236] Trace[1382343992]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 15:35:21.591) (total time: 11843ms): Nov 25 15:35:33 crc kubenswrapper[4704]: Trace[1382343992]: ---"Objects listed" error: 11843ms (15:35:33.434) Nov 25 15:35:33 crc kubenswrapper[4704]: Trace[1382343992]: [11.843636968s] [11.843636968s] END Nov 25 15:35:33 crc kubenswrapper[4704]: I1125 15:35:33.435006 4704 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 25 15:35:33 crc kubenswrapper[4704]: E1125 15:35:33.437061 4704 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Nov 25 15:35:33 crc kubenswrapper[4704]: I1125 15:35:33.439622 4704 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Nov 25 15:35:33 crc kubenswrapper[4704]: I1125 15:35:33.449222 4704 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 25 15:35:33 crc kubenswrapper[4704]: I1125 15:35:33.491575 4704 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:33 crc kubenswrapper[4704]: I1125 15:35:33.496279 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:33 crc kubenswrapper[4704]: I1125 15:35:33.498201 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:33 crc kubenswrapper[4704]: E1125 15:35:33.534453 4704 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.353822 4704 apiserver.go:52] "Watching apiserver" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.357901 4704 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.358165 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.358503 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.359027 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.359052 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.359038 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.359258 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.359286 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.359542 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.359581 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.359581 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.359878 4704 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.360565 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.360881 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.361202 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.361313 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.361511 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.361618 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.361963 4704 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"iptables-alerter-script" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.361997 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.362211 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.382191 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.393701 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.405940 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.426571 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.439142 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.449902 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.454206 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.454335 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.454416 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.454499 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.454601 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.454713 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.455122 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.455504 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456296 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456330 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456351 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456370 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456409 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456426 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456443 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456460 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456477 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456494 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456509 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456523 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456537 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456552 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456567 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456584 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456598 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456613 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456629 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456645 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456659 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456674 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456689 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456705 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456721 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456737 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456753 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 25 15:35:34 crc kubenswrapper[4704]: 
I1125 15:35:34.456768 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456782 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456815 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456831 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456849 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456867 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456883 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456898 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456913 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456930 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456945 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456962 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.456979 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457001 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457018 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457034 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457050 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457074 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457099 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457118 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457138 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457154 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457169 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457187 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457203 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457256 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457273 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457290 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457309 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457325 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457340 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457356 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457372 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.454646 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457390 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457406 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457424 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457441 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457524 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457562 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457592 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457619 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457643 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457666 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457689 4704 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457713 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457738 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457762 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457809 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457836 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" 
(UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457860 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457883 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457910 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457992 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458019 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458040 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458064 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458085 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458107 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458129 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458152 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458173 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458198 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458287 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458318 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458340 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458363 4704 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458386 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458409 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458467 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458492 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458518 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458568 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458595 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458620 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458642 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458667 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 25 
15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458693 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458720 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458744 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458767 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458813 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458837 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458865 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458890 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458913 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458934 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458956 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458983 4704 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459009 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459033 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459059 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459084 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459110 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459134 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459160 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459185 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459210 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459235 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459263 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459333 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459360 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459386 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459419 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459441 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459467 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459490 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459519 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459541 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457387 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457487 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.454896 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.455060 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.455448 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.455617 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.455627 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457053 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457177 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457261 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459670 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457288 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457367 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457562 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457593 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457664 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.457944 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458046 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458176 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458175 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458200 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458247 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458281 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458462 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458660 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458652 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458804 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458847 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458936 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458973 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.458981 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459037 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459184 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459377 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459401 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459414 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459471 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459488 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459536 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459728 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459923 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459984 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.460004 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.460021 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.460192 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.460203 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.460408 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.460599 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.454806 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461142 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.459566 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461403 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461417 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461430 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461460 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461484 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461508 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461530 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461550 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461570 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461591 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461610 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461630 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461656 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461674 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461691 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461707 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461723 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461740 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461759 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461778 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461822 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461846 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461865 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461888 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 15:35:34 crc 
kubenswrapper[4704]: I1125 15:35:34.461909 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461926 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461945 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461966 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461991 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462011 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462034 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462058 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462082 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462100 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462149 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462170 4704 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462186 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462204 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462248 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462265 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462286 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462304 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462372 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462393 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462411 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462430 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462447 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462467 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462484 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462502 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462521 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462538 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 25 
15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462577 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462599 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462622 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462638 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462656 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462675 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462695 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462713 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462733 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462756 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462776 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462810 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462828 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462847 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462902 4704 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462944 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462956 4704 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462966 4704 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462975 4704 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462988 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462998 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463008 4704 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463018 4704 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463028 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463038 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463048 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463058 4704 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463067 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463077 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463087 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463097 4704 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463109 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463121 4704 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463132 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463145 4704 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463154 4704 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc 
kubenswrapper[4704]: I1125 15:35:34.463163 4704 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463173 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463183 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463194 4704 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463203 4704 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463212 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463220 4704 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463230 4704 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463240 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463249 4704 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463258 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463268 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463279 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463288 4704 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463297 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463308 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463318 4704 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463329 4704 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463341 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463350 4704 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463362 4704 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463371 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node 
\"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463380 4704 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463391 4704 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463401 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463411 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463421 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461457 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461479 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.461868 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462502 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462520 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462562 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462728 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462903 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462910 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.462935 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463179 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463210 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463569 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463544 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.464823 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463635 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.463694 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.464212 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.464352 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.464433 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.464541 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.464521 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.464668 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.464575 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.464752 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.464968 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.465093 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.465178 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.465268 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:35:34.965244073 +0000 UTC m=+21.233517854 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.465547 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.465570 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.465590 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.465632 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.465685 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.465890 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.465989 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.466017 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.466217 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.466714 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.466716 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.467058 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.467141 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.467105 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.467130 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.467843 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.468011 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.468057 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.468304 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.468316 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.468366 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.468606 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.468845 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.468966 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.469178 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.469182 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.469274 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.469581 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.469659 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.469812 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.470002 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.470067 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.470095 4704 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.470218 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.470235 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:34.970180776 +0000 UTC m=+21.238454747 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.470175 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.470254 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.470411 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.470481 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.470541 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.470660 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.470913 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.471093 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.471281 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.471368 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.472152 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.471732 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.472012 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.472373 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.472494 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.472985 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.473076 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.473445 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.473566 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.473636 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.473651 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.473832 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.473895 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.474111 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.474177 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.474196 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.474437 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.474466 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.474626 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.475086 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.475220 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.475636 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.475780 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.475898 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.476144 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.476235 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.476245 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.476395 4704 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.476611 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.476736 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.476748 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.476831 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.476887 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.477000 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.477518 4704 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.477581 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:34.977564015 +0000 UTC m=+21.245837796 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.479169 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.480055 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.480814 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.480831 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.480970 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.481419 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.490729 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.492204 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.492968 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.494274 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.494353 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.494387 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.494462 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.493342 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.494941 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.494968 4704 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.494985 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" 
(OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.495048 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.490877 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.491080 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.495136 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.495153 4704 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.491345 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.491360 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.491370 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.491531 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.493153 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.493869 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.495085 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:34.995062049 +0000 UTC m=+21.263335830 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.495347 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:34.995302806 +0000 UTC m=+21.263576787 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.495414 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.496019 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.497962 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.498169 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.498635 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.498928 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.500268 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.500481 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.500838 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.501194 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.503557 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.503773 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.503831 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.504828 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.506112 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.506515 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.506809 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.506860 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.507111 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.508089 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.508167 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.508294 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.517362 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.519655 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.528338 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.532172 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.535946 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.539896 4704 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.544251 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.545661 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.555314 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.563892 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.563948 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564001 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564012 4704 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564021 4704 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564031 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564040 4704 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564049 4704 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564061 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564070 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564079 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564088 4704 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564096 4704 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564104 4704 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564112 4704 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564123 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564156 4704 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on 
node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564165 4704 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564174 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564182 4704 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564191 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564200 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564198 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564209 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564288 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564303 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564317 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564329 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564341 4704 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564354 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564366 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565159 4704 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565179 4704 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565189 4704 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565200 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565210 4704 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565220 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565229 4704 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565237 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565246 4704 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565255 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565264 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.564031 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565273 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565340 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565355 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565368 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565377 4704 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565388 4704 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565412 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565421 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565431 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565440 4704 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565448 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565457 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565466 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565490 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565498 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565507 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: 
I1125 15:35:34.565517 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565525 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565533 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565541 4704 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565565 4704 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565574 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565582 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565590 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565598 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565608 4704 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565616 4704 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565624 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565649 4704 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565658 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565665 4704 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node 
\"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565674 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565683 4704 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565691 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565699 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565725 4704 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565738 4704 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565748 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565758 4704 
reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565587 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565770 4704 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565955 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565966 4704 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565975 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565984 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.565994 4704 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566002 4704 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566012 4704 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566022 4704 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566031 4704 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566040 4704 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566051 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566060 4704 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566069 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566079 4704 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566090 4704 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566100 4704 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566119 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" 
DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566128 4704 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566137 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566146 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566157 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566167 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566177 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566186 4704 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc 
kubenswrapper[4704]: I1125 15:35:34.566197 4704 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566206 4704 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566214 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566225 4704 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566233 4704 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566242 4704 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566251 4704 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566260 4704 reconciler_common.go:293] "Volume 
detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566268 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566277 4704 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566286 4704 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566294 4704 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566304 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566314 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566322 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566330 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566338 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566347 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566356 4704 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566366 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566375 4704 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566383 4704 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566391 4704 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566400 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566408 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566416 4704 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566424 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566432 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566442 4704 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566454 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566465 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566476 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566486 4704 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566496 4704 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566506 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566520 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 
15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566530 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566540 4704 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566553 4704 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566563 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566575 4704 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.566588 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.575740 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.586031 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.593839 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.600691 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.674178 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.681549 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.688056 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 15:35:34 crc kubenswrapper[4704]: W1125 15:35:34.692337 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-deafd7ee7df0ac66640a0d4a22892973be65976d242b882d8bbb8ecd6a1efc0d WatchSource:0}: Error finding container deafd7ee7df0ac66640a0d4a22892973be65976d242b882d8bbb8ecd6a1efc0d: Status 404 returned error can't find the container with id deafd7ee7df0ac66640a0d4a22892973be65976d242b882d8bbb8ecd6a1efc0d Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.977739 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.978077 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:35:35.977995108 +0000 UTC m=+22.246268939 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.978679 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:34 crc kubenswrapper[4704]: I1125 15:35:34.979098 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.978905 4704 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.979226 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:35.979187185 +0000 UTC m=+22.247460966 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.979465 4704 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:35:34 crc kubenswrapper[4704]: E1125 15:35:34.979575 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:35.979552226 +0000 UTC m=+22.247826007 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.080708 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.080861 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:35 crc kubenswrapper[4704]: E1125 15:35:35.081063 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:35:35 crc kubenswrapper[4704]: E1125 15:35:35.081089 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:35:35 crc kubenswrapper[4704]: E1125 15:35:35.081384 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:35:35 crc kubenswrapper[4704]: E1125 15:35:35.081400 4704 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:35 crc kubenswrapper[4704]: E1125 15:35:35.081322 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:35:35 crc kubenswrapper[4704]: E1125 15:35:35.081490 4704 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:35 crc kubenswrapper[4704]: E1125 
15:35:35.081469 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:36.081448111 +0000 UTC m=+22.349722142 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:35 crc kubenswrapper[4704]: E1125 15:35:35.081575 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:36.081555814 +0000 UTC m=+22.349829605 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.416478 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:35 crc kubenswrapper[4704]: E1125 15:35:35.416666 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.537306 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a7b9061345b3261b19d098b88924b918e220c28b4b098c8048adc99e40e91896"} Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.539875 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71"} Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.539914 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1"} Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.539929 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fbed178212ada8583ab293cb274820df55ae2d8f2fa996807ca717e565ff6406"} Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.541350 4704 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7"} Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.541416 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"deafd7ee7df0ac66640a0d4a22892973be65976d242b882d8bbb8ecd6a1efc0d"} Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.558828 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.575015 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.592107 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.607179 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.620445 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.637389 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.655961 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.671989 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.687581 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.703951 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.723726 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.738326 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.751002 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.765876 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.777398 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.789635 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.989851 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.989939 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:35 crc kubenswrapper[4704]: E1125 15:35:35.989992 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:35:37.989968589 +0000 UTC m=+24.258242380 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:35:35 crc kubenswrapper[4704]: I1125 15:35:35.990024 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:35 crc kubenswrapper[4704]: E1125 15:35:35.990051 4704 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:35:35 crc kubenswrapper[4704]: E1125 15:35:35.990090 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:37.990082353 +0000 UTC m=+24.258356134 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:35:35 crc kubenswrapper[4704]: E1125 15:35:35.990099 4704 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:35:35 crc kubenswrapper[4704]: E1125 15:35:35.990131 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:37.990123104 +0000 UTC m=+24.258396885 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.091096 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.091170 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:36 crc kubenswrapper[4704]: E1125 15:35:36.091318 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:35:36 crc kubenswrapper[4704]: E1125 15:35:36.091341 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:35:36 crc kubenswrapper[4704]: E1125 15:35:36.091354 4704 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:36 crc kubenswrapper[4704]: E1125 15:35:36.091357 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:35:36 crc kubenswrapper[4704]: E1125 15:35:36.091413 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:35:36 crc kubenswrapper[4704]: E1125 15:35:36.091455 4704 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:36 crc kubenswrapper[4704]: E1125 15:35:36.091414 4704 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:38.09139742 +0000 UTC m=+24.359671201 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:36 crc kubenswrapper[4704]: E1125 15:35:36.091549 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:38.091532404 +0000 UTC m=+24.359806185 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.416339 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.416421 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:36 crc kubenswrapper[4704]: E1125 15:35:36.416505 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:35:36 crc kubenswrapper[4704]: E1125 15:35:36.416922 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.420865 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.422002 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.423829 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.424832 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.426197 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.426816 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.427938 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.429471 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.430230 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.431067 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.431649 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.432409 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.433170 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.433833 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.434371 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.434998 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.435731 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.436248 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.436962 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.437613 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.438268 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.438996 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.439498 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.441831 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.442721 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.443716 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.444578 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.445193 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.445898 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.446540 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.447188 4704 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.447317 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.448949 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.449535 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.450049 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.451469 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.454030 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.454819 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.455959 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.456710 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.457217 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.457868 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.458493 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.459234 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.459759 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.460366 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.460929 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.461716 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.462237 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.462687 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.463262 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.466953 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.467563 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.468093 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.701352 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.718649 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.719248 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.719386 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.737873 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.752231 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.765552 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.779370 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.793358 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.810089 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.826321 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.848454 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.864091 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.878952 4704 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.894980 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.909211 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.928101 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.943566 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.960646 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:36 crc kubenswrapper[4704]: I1125 15:35:36.974581 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.416332 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:37 crc kubenswrapper[4704]: E1125 15:35:37.416467 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.547945 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72"} Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.552239 4704 csr.go:261] certificate signing request csr-66kwp is approved, waiting to be issued Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.566757 4704 csr.go:257] certificate signing request csr-66kwp is issued Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.568139 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, 
/tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.578763 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8hrtj"] Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.579175 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8hrtj" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.581048 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.581135 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.581760 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.593551 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.603374 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/709e7197-d9e5-4981-b4a3-249323be907c-hosts-file\") pod \"node-resolver-8hrtj\" (UID: \"709e7197-d9e5-4981-b4a3-249323be907c\") " pod="openshift-dns/node-resolver-8hrtj" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.603467 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xswmf\" (UniqueName: \"kubernetes.io/projected/709e7197-d9e5-4981-b4a3-249323be907c-kube-api-access-xswmf\") pod \"node-resolver-8hrtj\" (UID: \"709e7197-d9e5-4981-b4a3-249323be907c\") " pod="openshift-dns/node-resolver-8hrtj" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.617946 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.633813 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.648259 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.665508 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.669608 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-9fq7f"] Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.669968 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-djz8x"] Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.670149 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9fq7f" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.670793 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.680982 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.681087 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.681196 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.681221 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.681219 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.681958 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.682394 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.682592 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.683369 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.691168 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.704266 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xswmf\" (UniqueName: \"kubernetes.io/projected/709e7197-d9e5-4981-b4a3-249323be907c-kube-api-access-xswmf\") pod \"node-resolver-8hrtj\" (UID: \"709e7197-d9e5-4981-b4a3-249323be907c\") " pod="openshift-dns/node-resolver-8hrtj" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.704323 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9r8k\" (UniqueName: \"kubernetes.io/projected/55051ec6-6c32-4004-8f1a-3069433c80cf-kube-api-access-t9r8k\") pod \"node-ca-9fq7f\" (UID: \"55051ec6-6c32-4004-8f1a-3069433c80cf\") " pod="openshift-image-registry/node-ca-9fq7f" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.704353 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfvv9\" (UniqueName: 
\"kubernetes.io/projected/91b52682-d008-4b8a-8bc3-26b032d7dc2c-kube-api-access-jfvv9\") pod \"machine-config-daemon-djz8x\" (UID: \"91b52682-d008-4b8a-8bc3-26b032d7dc2c\") " pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.704376 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/55051ec6-6c32-4004-8f1a-3069433c80cf-host\") pod \"node-ca-9fq7f\" (UID: \"55051ec6-6c32-4004-8f1a-3069433c80cf\") " pod="openshift-image-registry/node-ca-9fq7f" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.704409 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/91b52682-d008-4b8a-8bc3-26b032d7dc2c-rootfs\") pod \"machine-config-daemon-djz8x\" (UID: \"91b52682-d008-4b8a-8bc3-26b032d7dc2c\") " pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.704456 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/709e7197-d9e5-4981-b4a3-249323be907c-hosts-file\") pod \"node-resolver-8hrtj\" (UID: \"709e7197-d9e5-4981-b4a3-249323be907c\") " pod="openshift-dns/node-resolver-8hrtj" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.704563 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/55051ec6-6c32-4004-8f1a-3069433c80cf-serviceca\") pod \"node-ca-9fq7f\" (UID: \"55051ec6-6c32-4004-8f1a-3069433c80cf\") " pod="openshift-image-registry/node-ca-9fq7f" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.704612 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/91b52682-d008-4b8a-8bc3-26b032d7dc2c-mcd-auth-proxy-config\") pod \"machine-config-daemon-djz8x\" (UID: \"91b52682-d008-4b8a-8bc3-26b032d7dc2c\") " pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.704626 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/709e7197-d9e5-4981-b4a3-249323be907c-hosts-file\") pod \"node-resolver-8hrtj\" (UID: \"709e7197-d9e5-4981-b4a3-249323be907c\") " pod="openshift-dns/node-resolver-8hrtj" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.704698 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91b52682-d008-4b8a-8bc3-26b032d7dc2c-proxy-tls\") pod \"machine-config-daemon-djz8x\" (UID: \"91b52682-d008-4b8a-8bc3-26b032d7dc2c\") " pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.709471 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.723530 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xswmf\" (UniqueName: \"kubernetes.io/projected/709e7197-d9e5-4981-b4a3-249323be907c-kube-api-access-xswmf\") pod \"node-resolver-8hrtj\" (UID: \"709e7197-d9e5-4981-b4a3-249323be907c\") " pod="openshift-dns/node-resolver-8hrtj" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.726690 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.748650 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.781950 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.806042 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/91b52682-d008-4b8a-8bc3-26b032d7dc2c-rootfs\") pod \"machine-config-daemon-djz8x\" (UID: \"91b52682-d008-4b8a-8bc3-26b032d7dc2c\") " pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.806100 4704 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/55051ec6-6c32-4004-8f1a-3069433c80cf-serviceca\") pod \"node-ca-9fq7f\" (UID: \"55051ec6-6c32-4004-8f1a-3069433c80cf\") " pod="openshift-image-registry/node-ca-9fq7f" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.806138 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/91b52682-d008-4b8a-8bc3-26b032d7dc2c-mcd-auth-proxy-config\") pod \"machine-config-daemon-djz8x\" (UID: \"91b52682-d008-4b8a-8bc3-26b032d7dc2c\") " pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.806170 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91b52682-d008-4b8a-8bc3-26b032d7dc2c-proxy-tls\") pod \"machine-config-daemon-djz8x\" (UID: \"91b52682-d008-4b8a-8bc3-26b032d7dc2c\") " pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.806189 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9r8k\" (UniqueName: \"kubernetes.io/projected/55051ec6-6c32-4004-8f1a-3069433c80cf-kube-api-access-t9r8k\") pod \"node-ca-9fq7f\" (UID: \"55051ec6-6c32-4004-8f1a-3069433c80cf\") " pod="openshift-image-registry/node-ca-9fq7f" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.806204 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfvv9\" (UniqueName: \"kubernetes.io/projected/91b52682-d008-4b8a-8bc3-26b032d7dc2c-kube-api-access-jfvv9\") pod \"machine-config-daemon-djz8x\" (UID: \"91b52682-d008-4b8a-8bc3-26b032d7dc2c\") " pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.806227 4704 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/55051ec6-6c32-4004-8f1a-3069433c80cf-host\") pod \"node-ca-9fq7f\" (UID: \"55051ec6-6c32-4004-8f1a-3069433c80cf\") " pod="openshift-image-registry/node-ca-9fq7f" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.806220 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/91b52682-d008-4b8a-8bc3-26b032d7dc2c-rootfs\") pod \"machine-config-daemon-djz8x\" (UID: \"91b52682-d008-4b8a-8bc3-26b032d7dc2c\") " pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.806293 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/55051ec6-6c32-4004-8f1a-3069433c80cf-host\") pod \"node-ca-9fq7f\" (UID: \"55051ec6-6c32-4004-8f1a-3069433c80cf\") " pod="openshift-image-registry/node-ca-9fq7f" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.807774 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/55051ec6-6c32-4004-8f1a-3069433c80cf-serviceca\") pod \"node-ca-9fq7f\" (UID: \"55051ec6-6c32-4004-8f1a-3069433c80cf\") " pod="openshift-image-registry/node-ca-9fq7f" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.807870 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/91b52682-d008-4b8a-8bc3-26b032d7dc2c-mcd-auth-proxy-config\") pod \"machine-config-daemon-djz8x\" (UID: \"91b52682-d008-4b8a-8bc3-26b032d7dc2c\") " pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.811162 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/91b52682-d008-4b8a-8bc3-26b032d7dc2c-proxy-tls\") pod \"machine-config-daemon-djz8x\" (UID: \"91b52682-d008-4b8a-8bc3-26b032d7dc2c\") " pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.819046 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.836274 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfvv9\" (UniqueName: \"kubernetes.io/projected/91b52682-d008-4b8a-8bc3-26b032d7dc2c-kube-api-access-jfvv9\") pod \"machine-config-daemon-djz8x\" (UID: \"91b52682-d008-4b8a-8bc3-26b032d7dc2c\") " pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.841837 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9r8k\" (UniqueName: \"kubernetes.io/projected/55051ec6-6c32-4004-8f1a-3069433c80cf-kube-api-access-t9r8k\") pod \"node-ca-9fq7f\" (UID: \"55051ec6-6c32-4004-8f1a-3069433c80cf\") " pod="openshift-image-registry/node-ca-9fq7f" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.850189 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.870756 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.892336 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-8hrtj" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.899428 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: W1125 15:35:37.907095 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod709e7197_d9e5_4981_b4a3_249323be907c.slice/crio-b8de84388e4c55f736cb6d78f762009f8e0b09d4b00f5a47284d904146826f24 WatchSource:0}: Error finding container b8de84388e4c55f736cb6d78f762009f8e0b09d4b00f5a47284d904146826f24: Status 404 returned error can't find the container with id 
b8de84388e4c55f736cb6d78f762009f8e0b09d4b00f5a47284d904146826f24 Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.928471 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.944185 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.958172 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.979981 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.985335 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9fq7f" Nov 25 15:35:37 crc kubenswrapper[4704]: I1125 15:35:37.994154 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.007838 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.007929 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.008169 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:38 crc kubenswrapper[4704]: E1125 15:35:38.008308 4704 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:35:38 crc kubenswrapper[4704]: E1125 15:35:38.008331 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 15:35:42.00829363 +0000 UTC m=+28.276567411 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:35:38 crc kubenswrapper[4704]: E1125 15:35:38.008380 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:42.008363322 +0000 UTC m=+28.276637093 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:35:38 crc kubenswrapper[4704]: E1125 15:35:38.008513 4704 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:35:38 crc kubenswrapper[4704]: E1125 15:35:38.008564 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:42.008557438 +0000 UTC m=+28.276831219 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.011008 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: W1125 15:35:38.015849 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91b52682_d008_4b8a_8bc3_26b032d7dc2c.slice/crio-ef7e516180659d15189a47eb1a14d17482314891ae5e3eb78814e5ab40d6c365 WatchSource:0}: Error finding container 
ef7e516180659d15189a47eb1a14d17482314891ae5e3eb78814e5ab40d6c365: Status 404 returned error can't find the container with id ef7e516180659d15189a47eb1a14d17482314891ae5e3eb78814e5ab40d6c365 Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.029209 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.097318 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-l5scf"] Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.098169 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-h92xm"] Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.098429 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.098428 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.103958 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.103969 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.104976 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.104973 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.106997 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.107230 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.107357 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.108747 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.108822 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:38 crc kubenswrapper[4704]: E1125 15:35:38.109157 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:35:38 crc kubenswrapper[4704]: E1125 15:35:38.109177 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:35:38 crc kubenswrapper[4704]: E1125 15:35:38.109192 4704 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:38 crc kubenswrapper[4704]: E1125 15:35:38.109242 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:42.109223428 +0000 UTC m=+28.377497209 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:38 crc kubenswrapper[4704]: E1125 15:35:38.109312 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:35:38 crc kubenswrapper[4704]: E1125 15:35:38.109328 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:35:38 crc kubenswrapper[4704]: E1125 15:35:38.109340 4704 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:38 crc kubenswrapper[4704]: E1125 15:35:38.109372 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:42.109363152 +0000 UTC m=+28.377636933 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.129156 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f91
63022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.146464 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.164259 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.182009 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.199722 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209237 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-os-release\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209286 4704 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-var-lib-cni-multus\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209312 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-multus-daemon-config\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209341 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-os-release\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209370 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-multus-socket-dir-parent\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209391 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-var-lib-kubelet\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209430 4704 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-cni-binary-copy\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209453 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-multus-conf-dir\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209553 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209609 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-multus-cni-dir\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209629 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-run-k8s-cni-cncf-io\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209647 4704 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-var-lib-cni-bin\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209665 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x8mj\" (UniqueName: \"kubernetes.io/projected/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-kube-api-access-5x8mj\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209703 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-cni-binary-copy\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209783 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-cnibin\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209853 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-system-cni-dir\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" 
Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209886 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-system-cni-dir\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209909 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-hostroot\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.209934 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dk7h\" (UniqueName: \"kubernetes.io/projected/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-kube-api-access-6dk7h\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.210176 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-cnibin\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.210214 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-run-multus-certs\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc 
kubenswrapper[4704]: I1125 15:35:38.210243 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.210279 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-run-netns\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.210298 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-etc-kubernetes\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.225642 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.241899 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.265510 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.281537 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.305544 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.310965 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-run-netns\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311015 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-etc-kubernetes\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311037 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311071 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-os-release\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311093 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-var-lib-cni-multus\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 
25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311113 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-multus-daemon-config\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311133 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-os-release\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311154 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-multus-socket-dir-parent\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311174 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-var-lib-kubelet\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311205 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-multus-conf-dir\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311227 4704 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311249 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-cni-binary-copy\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311271 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-multus-cni-dir\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311309 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-run-k8s-cni-cncf-io\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311342 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-var-lib-cni-bin\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311364 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x8mj\" 
(UniqueName: \"kubernetes.io/projected/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-kube-api-access-5x8mj\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311394 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-cnibin\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311416 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-cni-binary-copy\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311443 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-system-cni-dir\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311468 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-system-cni-dir\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311487 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-hostroot\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311508 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dk7h\" (UniqueName: \"kubernetes.io/projected/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-kube-api-access-6dk7h\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311538 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-cnibin\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311559 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-run-multus-certs\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311685 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-run-multus-certs\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311733 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-run-netns\") pod \"multus-h92xm\" (UID: 
\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311762 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-etc-kubernetes\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.311944 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-multus-cni-dir\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.312041 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-system-cni-dir\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.312324 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-cnibin\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.312368 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-run-k8s-cni-cncf-io\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: 
I1125 15:35:38.312370 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-var-lib-cni-multus\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.312378 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-multus-conf-dir\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.312412 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-var-lib-cni-bin\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.312429 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-cnibin\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.312454 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-hostroot\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.312474 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-multus-socket-dir-parent\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.312547 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-os-release\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.312685 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.312723 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-system-cni-dir\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.312840 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-host-var-lib-kubelet\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.312906 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-os-release\") pod \"multus-additional-cni-plugins-l5scf\" (UID: 
\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.313205 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-cni-binary-copy\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.313268 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l5scf\" (UID: \"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.313357 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-multus-daemon-config\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.313543 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-cni-binary-copy\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.337672 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dk7h\" (UniqueName: \"kubernetes.io/projected/f430d4f1-803f-4dc6-a319-4e0b8836cf1e-kube-api-access-6dk7h\") pod \"multus-additional-cni-plugins-l5scf\" (UID: 
\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\") " pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.340450 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x8mj\" (UniqueName: \"kubernetes.io/projected/d2820ade-e9bd-4146-b275-0c3b7d0cb5aa-kube-api-access-5x8mj\") pod \"multus-h92xm\" (UID: \"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\") " pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.346112 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.384291 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.412494 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.415876 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.415879 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:38 crc kubenswrapper[4704]: E1125 15:35:38.416046 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:35:38 crc kubenswrapper[4704]: E1125 15:35:38.416211 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.417102 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h92xm" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.426994 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l5scf" Nov 25 15:35:38 crc kubenswrapper[4704]: W1125 15:35:38.429511 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2820ade_e9bd_4146_b275_0c3b7d0cb5aa.slice/crio-200d2dd53473ce60007be3c17afd2f29e9c7cc3f8e33d630406f16a740a6f2be WatchSource:0}: Error finding container 200d2dd53473ce60007be3c17afd2f29e9c7cc3f8e33d630406f16a740a6f2be: Status 404 returned error can't find the container with id 200d2dd53473ce60007be3c17afd2f29e9c7cc3f8e33d630406f16a740a6f2be Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.446528 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: W1125 15:35:38.449756 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf430d4f1_803f_4dc6_a319_4e0b8836cf1e.slice/crio-a57bffc54b9d422e391a9830926ca8fdf7cde392b8a5f136451b93a80b639a1a WatchSource:0}: Error finding container a57bffc54b9d422e391a9830926ca8fdf7cde392b8a5f136451b93a80b639a1a: Status 404 returned error can't find the container with id a57bffc54b9d422e391a9830926ca8fdf7cde392b8a5f136451b93a80b639a1a Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.461782 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.479509 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.498958 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.518116 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.536066 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.550781 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.553045 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" event={"ID":"f430d4f1-803f-4dc6-a319-4e0b8836cf1e","Type":"ContainerStarted","Data":"a57bffc54b9d422e391a9830926ca8fdf7cde392b8a5f136451b93a80b639a1a"} Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.555685 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" 
event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerStarted","Data":"f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1"} Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.555723 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerStarted","Data":"7929ba83619d7931b1d300c06bf9091d9e711ae6a14dad101005c5b42d07f675"} Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.555736 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerStarted","Data":"ef7e516180659d15189a47eb1a14d17482314891ae5e3eb78814e5ab40d6c365"} Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.557925 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8hrtj" event={"ID":"709e7197-d9e5-4981-b4a3-249323be907c","Type":"ContainerStarted","Data":"3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930"} Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.557956 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8hrtj" event={"ID":"709e7197-d9e5-4981-b4a3-249323be907c","Type":"ContainerStarted","Data":"b8de84388e4c55f736cb6d78f762009f8e0b09d4b00f5a47284d904146826f24"} Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.560418 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h92xm" event={"ID":"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa","Type":"ContainerStarted","Data":"d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59"} Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.560443 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h92xm" 
event={"ID":"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa","Type":"ContainerStarted","Data":"200d2dd53473ce60007be3c17afd2f29e9c7cc3f8e33d630406f16a740a6f2be"} Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.562417 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9fq7f" event={"ID":"55051ec6-6c32-4004-8f1a-3069433c80cf","Type":"ContainerStarted","Data":"e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6"} Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.562452 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9fq7f" event={"ID":"55051ec6-6c32-4004-8f1a-3069433c80cf","Type":"ContainerStarted","Data":"9a386331e3b78140396c88b6c78a20cc91e211cc90dca9bcbc4def22e653f8c0"} Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.567072 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.568045 4704 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-11-25 15:30:37 +0000 UTC, rotation deadline is 2026-08-26 00:46:41.41761259 +0000 UTC Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.568126 4704 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6561h11m2.849490072s for next certificate rotation Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.582184 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.600876 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.617649 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.632669 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.646407 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.662389 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.681751 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.700859 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.716806 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.735757 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.752880 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.772626 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.786357 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.799172 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.813542 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.826356 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.842361 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.848775 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5kt46"] Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.850148 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.852178 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.852366 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.852844 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.852845 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.853305 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.853635 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.853823 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.865349 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.887668 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.902540 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.917587 4704 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-ovnkube-config\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.917635 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-var-lib-openvswitch\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.917653 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-node-log\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.917673 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-log-socket\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.917690 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-cni-bin\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.917727 
4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-env-overrides\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.917763 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-openvswitch\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.917837 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.917861 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw5d9\" (UniqueName: \"kubernetes.io/projected/f5274608-0c76-48d9-949d-53254df99b83-kube-api-access-cw5d9\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.917892 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-run-ovn-kubernetes\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.917911 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-cni-netd\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.917945 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-systemd\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.917965 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-etc-openvswitch\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.917984 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f5274608-0c76-48d9-949d-53254df99b83-ovn-node-metrics-cert\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.918018 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-run-netns\") pod \"ovnkube-node-5kt46\" (UID: 
\"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.918038 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-kubelet\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.918059 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-ovnkube-script-lib\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.918088 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-systemd-units\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.918108 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-slash\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.918129 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-ovn\") pod \"ovnkube-node-5kt46\" (UID: 
\"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.918856 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.943501 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.967928 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.983782 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:38 crc kubenswrapper[4704]: I1125 15:35:38.999003 4704 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.013997 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019162 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019224 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-cw5d9\" (UniqueName: \"kubernetes.io/projected/f5274608-0c76-48d9-949d-53254df99b83-kube-api-access-cw5d9\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019294 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-run-ovn-kubernetes\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019333 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-cni-netd\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019351 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-systemd\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019368 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-etc-openvswitch\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019364 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019389 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f5274608-0c76-48d9-949d-53254df99b83-ovn-node-metrics-cert\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019483 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-run-ovn-kubernetes\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019559 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-systemd\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019504 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-run-netns\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019597 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-cni-netd\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019534 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-run-netns\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019631 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-kubelet\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019666 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-ovnkube-script-lib\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019622 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-etc-openvswitch\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019717 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-systemd-units\") pod 
\"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019731 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-kubelet\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019741 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-slash\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019773 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-slash\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019778 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-ovn\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019837 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-systemd-units\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 
crc kubenswrapper[4704]: I1125 15:35:39.019860 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-ovnkube-config\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019886 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-ovn\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019908 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-var-lib-openvswitch\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019939 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-node-log\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019960 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-log-socket\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.019985 4704 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-cni-bin\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.020005 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-env-overrides\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.020028 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-openvswitch\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.020087 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-openvswitch\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.020113 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-var-lib-openvswitch\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.020142 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-node-log\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.020206 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-cni-bin\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.020268 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-log-socket\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.020840 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-ovnkube-config\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.020910 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-ovnkube-script-lib\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.020938 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-env-overrides\") pod \"ovnkube-node-5kt46\" (UID: 
\"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.023752 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f5274608-0c76-48d9-949d-53254df99b83-ovn-node-metrics-cert\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.027463 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4274
5f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.036106 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw5d9\" (UniqueName: \"kubernetes.io/projected/f5274608-0c76-48d9-949d-53254df99b83-kube-api-access-cw5d9\") pod \"ovnkube-node-5kt46\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.043916 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.060560 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.077271 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.116722 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.157904 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.163044 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:39 crc kubenswrapper[4704]: W1125 15:35:39.176821 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5274608_0c76_48d9_949d_53254df99b83.slice/crio-86a6e2b98ed5c37e9f0fbaecb72bca014099df11c423004fd1ad2fc6d2720538 WatchSource:0}: Error finding container 86a6e2b98ed5c37e9f0fbaecb72bca014099df11c423004fd1ad2fc6d2720538: Status 404 returned error can't find the container with id 86a6e2b98ed5c37e9f0fbaecb72bca014099df11c423004fd1ad2fc6d2720538 Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.199855 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.241639 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.278655 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.415834 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:39 crc kubenswrapper[4704]: E1125 15:35:39.415977 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.568395 4704 generic.go:334] "Generic (PLEG): container finished" podID="f430d4f1-803f-4dc6-a319-4e0b8836cf1e" containerID="159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d" exitCode=0 Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.568506 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" event={"ID":"f430d4f1-803f-4dc6-a319-4e0b8836cf1e","Type":"ContainerDied","Data":"159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d"} Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.569855 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerStarted","Data":"86a6e2b98ed5c37e9f0fbaecb72bca014099df11c423004fd1ad2fc6d2720538"} Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.592604 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.606905 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.620433 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.634598 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.649942 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.663532 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.684117 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.716532 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.735018 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.766184 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.787401 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.804067 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.819394 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.837362 4704 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.838349 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.839627 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.839665 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.839678 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.839827 4704 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.898254 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.911341 4704 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.911782 4704 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.916428 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.916725 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.916970 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.917007 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.917034 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:39Z","lastTransitionTime":"2025-11-25T15:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:39 crc kubenswrapper[4704]: E1125 15:35:39.939356 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.943332 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.943371 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.943381 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.943398 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.943411 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:39Z","lastTransitionTime":"2025-11-25T15:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:39 crc kubenswrapper[4704]: E1125 15:35:39.960407 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.964071 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.964112 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.964123 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.964143 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.964159 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:39Z","lastTransitionTime":"2025-11-25T15:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:39 crc kubenswrapper[4704]: E1125 15:35:39.976851 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.981147 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.981196 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.981210 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.981233 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.981247 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:39Z","lastTransitionTime":"2025-11-25T15:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:39 crc kubenswrapper[4704]: E1125 15:35:39.993337 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:39Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.997062 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.997109 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.997120 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.997139 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:39 crc kubenswrapper[4704]: I1125 15:35:39.997166 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:39Z","lastTransitionTime":"2025-11-25T15:35:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:40 crc kubenswrapper[4704]: E1125 15:35:40.009597 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: E1125 15:35:40.009868 4704 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.011287 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.011311 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.011322 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.011339 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.011351 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:40Z","lastTransitionTime":"2025-11-25T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.114039 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.114086 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.114096 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.114114 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.114124 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:40Z","lastTransitionTime":"2025-11-25T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.217604 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.217651 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.217662 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.217683 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.217695 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:40Z","lastTransitionTime":"2025-11-25T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.320629 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.320662 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.320671 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.320686 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.320700 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:40Z","lastTransitionTime":"2025-11-25T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.416110 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.416214 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:40 crc kubenswrapper[4704]: E1125 15:35:40.416867 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:35:40 crc kubenswrapper[4704]: E1125 15:35:40.416992 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.423550 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.423613 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.423632 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.423662 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.423690 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:40Z","lastTransitionTime":"2025-11-25T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.526524 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.526591 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.526605 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.526628 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.526643 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:40Z","lastTransitionTime":"2025-11-25T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.575262 4704 generic.go:334] "Generic (PLEG): container finished" podID="f430d4f1-803f-4dc6-a319-4e0b8836cf1e" containerID="8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e" exitCode=0 Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.575331 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" event={"ID":"f430d4f1-803f-4dc6-a319-4e0b8836cf1e","Type":"ContainerDied","Data":"8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e"} Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.577625 4704 generic.go:334] "Generic (PLEG): container finished" podID="f5274608-0c76-48d9-949d-53254df99b83" containerID="58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58" exitCode=0 Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.577656 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerDied","Data":"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58"} Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.594926 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.610478 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.623742 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.629465 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.629505 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.629519 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.629541 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.629555 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:40Z","lastTransitionTime":"2025-11-25T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.639455 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z 
is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.663677 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11
-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.680240 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.692369 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.713041 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.726059 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.732596 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.732636 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.732645 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.732662 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.732673 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:40Z","lastTransitionTime":"2025-11-25T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.737353 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.756712 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.771250 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.789929 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.804999 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.819915 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.833590 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.836119 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.836158 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.836167 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.836186 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.836199 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:40Z","lastTransitionTime":"2025-11-25T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.846512 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.860904 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.875373 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.897479 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.915348 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.929546 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.938286 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.938322 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.938332 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.938350 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.938361 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:40Z","lastTransitionTime":"2025-11-25T15:35:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.948543 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.961510 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.977670 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:40 crc kubenswrapper[4704]: I1125 15:35:40.990746 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:40Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.006226 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.039657 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.040999 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.041024 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.041052 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.041070 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.041080 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:41Z","lastTransitionTime":"2025-11-25T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.078584 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.119272 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661
fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.144959 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.145020 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.145034 4704 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.145054 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.145126 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:41Z","lastTransitionTime":"2025-11-25T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.247326 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.247368 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.247379 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.247396 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.247406 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:41Z","lastTransitionTime":"2025-11-25T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.350278 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.350327 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.350338 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.350355 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.350367 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:41Z","lastTransitionTime":"2025-11-25T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.415939 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:41 crc kubenswrapper[4704]: E1125 15:35:41.416088 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.452659 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.452696 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.452705 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.452718 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.452729 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:41Z","lastTransitionTime":"2025-11-25T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.555457 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.555502 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.555513 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.555529 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.555539 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:41Z","lastTransitionTime":"2025-11-25T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.584444 4704 generic.go:334] "Generic (PLEG): container finished" podID="f430d4f1-803f-4dc6-a319-4e0b8836cf1e" containerID="535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b" exitCode=0 Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.584524 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" event={"ID":"f430d4f1-803f-4dc6-a319-4e0b8836cf1e","Type":"ContainerDied","Data":"535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b"} Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.588440 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerStarted","Data":"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2"} Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.588483 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerStarted","Data":"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10"} Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.588495 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerStarted","Data":"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7"} Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.588507 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerStarted","Data":"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc"} Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.588517 4704 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerStarted","Data":"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a"} Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.588528 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerStarted","Data":"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a"} Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.597078 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.611561 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.624608 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.638679 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.658726 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.658792 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.658819 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.658843 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.658896 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:41Z","lastTransitionTime":"2025-11-25T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.659870 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.672745 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.692485 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.721527 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.753852 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.761331 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.761391 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.761408 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 
15:35:41.761434 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.761451 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:41Z","lastTransitionTime":"2025-11-25T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.768283 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.782493 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 
15:35:41.796413 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.812332 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, 
/tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.826101 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.838432 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:41Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.865372 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.865431 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.865441 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.865459 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.865472 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:41Z","lastTransitionTime":"2025-11-25T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.968658 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.968711 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.968723 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.968743 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:41 crc kubenswrapper[4704]: I1125 15:35:41.968757 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:41Z","lastTransitionTime":"2025-11-25T15:35:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.052882 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.053074 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:42 crc kubenswrapper[4704]: E1125 15:35:42.053108 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:35:50.05307779 +0000 UTC m=+36.321351561 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.053168 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:42 crc kubenswrapper[4704]: E1125 15:35:42.053191 4704 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:35:42 crc kubenswrapper[4704]: E1125 15:35:42.053252 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:50.053238785 +0000 UTC m=+36.321512756 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:35:42 crc kubenswrapper[4704]: E1125 15:35:42.053306 4704 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:35:42 crc kubenswrapper[4704]: E1125 15:35:42.053346 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:50.053336777 +0000 UTC m=+36.321610558 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.070632 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.070665 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.070674 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.070688 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 
25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.070698 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:42Z","lastTransitionTime":"2025-11-25T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.153849 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.153939 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:42 crc kubenswrapper[4704]: E1125 15:35:42.154068 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:35:42 crc kubenswrapper[4704]: E1125 15:35:42.154084 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:35:42 crc kubenswrapper[4704]: E1125 15:35:42.154095 4704 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:42 crc kubenswrapper[4704]: E1125 15:35:42.154122 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:35:42 crc kubenswrapper[4704]: E1125 15:35:42.154162 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:35:42 crc kubenswrapper[4704]: E1125 15:35:42.154180 4704 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:42 crc kubenswrapper[4704]: E1125 15:35:42.154144 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:50.154130772 +0000 UTC m=+36.422404553 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:42 crc kubenswrapper[4704]: E1125 15:35:42.154267 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:35:50.154240455 +0000 UTC m=+36.422514376 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.173439 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.173477 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.173486 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.173502 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.173511 4704 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:42Z","lastTransitionTime":"2025-11-25T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.276440 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.276486 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.276494 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.276515 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.276525 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:42Z","lastTransitionTime":"2025-11-25T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.378476 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.378537 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.378548 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.378563 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.378573 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:42Z","lastTransitionTime":"2025-11-25T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.416055 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.416126 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:42 crc kubenswrapper[4704]: E1125 15:35:42.416196 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:35:42 crc kubenswrapper[4704]: E1125 15:35:42.416280 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.481840 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.481941 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.481954 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.481974 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.481988 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:42Z","lastTransitionTime":"2025-11-25T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.583976 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.584245 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.584304 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.584365 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.584428 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:42Z","lastTransitionTime":"2025-11-25T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.592950 4704 generic.go:334] "Generic (PLEG): container finished" podID="f430d4f1-803f-4dc6-a319-4e0b8836cf1e" containerID="22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc" exitCode=0 Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.592991 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" event={"ID":"f430d4f1-803f-4dc6-a319-4e0b8836cf1e","Type":"ContainerDied","Data":"22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc"} Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.613067 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.631006 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.644920 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.664284 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.678603 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.689387 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.689426 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.689439 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 
15:35:42.689458 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.689470 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:42Z","lastTransitionTime":"2025-11-25T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.692406 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.707810 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.721298 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.733980 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.746107 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.757413 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.772466 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.784971 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.792252 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.792285 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.792302 4704 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.792320 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.792332 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:42Z","lastTransitionTime":"2025-11-25T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.796327 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.808980 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:42Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.895141 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:42 crc 
kubenswrapper[4704]: I1125 15:35:42.895198 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.895208 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.895233 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.895248 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:42Z","lastTransitionTime":"2025-11-25T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.997727 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.997777 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.997810 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.997829 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:42 crc kubenswrapper[4704]: I1125 15:35:42.997843 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:42Z","lastTransitionTime":"2025-11-25T15:35:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.100349 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.100398 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.100412 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.100431 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.100446 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:43Z","lastTransitionTime":"2025-11-25T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.202785 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.202841 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.202849 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.202866 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.202877 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:43Z","lastTransitionTime":"2025-11-25T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.306063 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.306126 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.306143 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.306169 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.306185 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:43Z","lastTransitionTime":"2025-11-25T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.408849 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.408894 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.408907 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.408924 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.408937 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:43Z","lastTransitionTime":"2025-11-25T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.415864 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:43 crc kubenswrapper[4704]: E1125 15:35:43.415995 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.512021 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.512067 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.512080 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.512101 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.512114 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:43Z","lastTransitionTime":"2025-11-25T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.599952 4704 generic.go:334] "Generic (PLEG): container finished" podID="f430d4f1-803f-4dc6-a319-4e0b8836cf1e" containerID="55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7" exitCode=0 Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.600057 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" event={"ID":"f430d4f1-803f-4dc6-a319-4e0b8836cf1e","Type":"ContainerDied","Data":"55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7"} Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.607105 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerStarted","Data":"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4"} Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.616515 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.616608 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.616627 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.616654 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.616683 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:43Z","lastTransitionTime":"2025-11-25T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.621626 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.635446 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661
fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.650430 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.665234 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.679386 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.693111 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.713498 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.719734 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.719841 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.719850 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.719889 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.719900 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:43Z","lastTransitionTime":"2025-11-25T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.728039 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498
d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.741921 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.761775 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.775874 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.791354 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.806762 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.819595 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.822272 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.822304 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.822314 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.822331 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.822343 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:43Z","lastTransitionTime":"2025-11-25T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.834687 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:43Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.926876 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.927329 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.927371 4704 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.927392 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:43 crc kubenswrapper[4704]: I1125 15:35:43.927405 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:43Z","lastTransitionTime":"2025-11-25T15:35:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.030400 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.030476 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.030494 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.030524 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.030541 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:44Z","lastTransitionTime":"2025-11-25T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.132933 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.132973 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.132984 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.133000 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.133011 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:44Z","lastTransitionTime":"2025-11-25T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.233506 4704 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.237499 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.237545 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.237556 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.237575 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.237587 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:44Z","lastTransitionTime":"2025-11-25T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.340278 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.340320 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.340329 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.340347 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.340357 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:44Z","lastTransitionTime":"2025-11-25T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.416029 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.416049 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:44 crc kubenswrapper[4704]: E1125 15:35:44.416724 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:35:44 crc kubenswrapper[4704]: E1125 15:35:44.416812 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.429601 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.443343 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.443414 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:44 crc 
kubenswrapper[4704]: I1125 15:35:44.443561 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.443575 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.443593 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.443606 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:44Z","lastTransitionTime":"2025-11-25T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.459396 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.474931 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.496432 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.519661 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.538852 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.546019 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.546056 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.546065 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.546079 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.546088 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:44Z","lastTransitionTime":"2025-11-25T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.551525 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 
15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.567119 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.578903 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.589049 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.603527 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.648905 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.648944 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.648953 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.648971 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.648980 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:44Z","lastTransitionTime":"2025-11-25T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.653201 4704 generic.go:334] "Generic (PLEG): container finished" podID="f430d4f1-803f-4dc6-a319-4e0b8836cf1e" containerID="490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469" exitCode=0 Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.653251 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" event={"ID":"f430d4f1-803f-4dc6-a319-4e0b8836cf1e","Type":"ContainerDied","Data":"490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469"} Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.685897 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.707261 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.742136 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.763583 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.763623 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.763634 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.763653 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.763664 4704 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:44Z","lastTransitionTime":"2025-11-25T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.763525 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.776138 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.786227 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.804481 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.819197 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.831076 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.846256 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.861394 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.869163 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.869249 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.869267 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.869289 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.869303 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:44Z","lastTransitionTime":"2025-11-25T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.875698 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.887992 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.899131 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.913273 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.925604 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.939167 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.954075 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:44Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.972089 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.972125 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.972135 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.972150 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:44 crc kubenswrapper[4704]: I1125 15:35:44.972159 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:44Z","lastTransitionTime":"2025-11-25T15:35:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.074615 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.074680 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.074698 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.074722 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.074735 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:45Z","lastTransitionTime":"2025-11-25T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.177609 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.177666 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.177681 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.177701 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.177713 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:45Z","lastTransitionTime":"2025-11-25T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.281073 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.281125 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.281143 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.281167 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.281186 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:45Z","lastTransitionTime":"2025-11-25T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.384560 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.384621 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.384630 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.384648 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.384662 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:45Z","lastTransitionTime":"2025-11-25T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.415405 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:45 crc kubenswrapper[4704]: E1125 15:35:45.415550 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.487332 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.487381 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.487392 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.487411 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.487424 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:45Z","lastTransitionTime":"2025-11-25T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.590211 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.590267 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.590279 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.590300 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.590312 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:45Z","lastTransitionTime":"2025-11-25T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.660978 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" event={"ID":"f430d4f1-803f-4dc6-a319-4e0b8836cf1e","Type":"ContainerStarted","Data":"c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652"} Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.667036 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerStarted","Data":"8115eae7e33c40ba886d5faf6e669b73b04e8004dae9c19fdd6ce096ee690e08"} Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.667327 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.667345 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.686209 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.692178 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.692202 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.692211 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.692224 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.692233 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:45Z","lastTransitionTime":"2025-11-25T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.703778 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.716337 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.733088 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.733161 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.737519 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.744850 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.750427 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.772792 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e
1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.789927 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.795291 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.795331 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.795341 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.795360 
4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.795371 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:45Z","lastTransitionTime":"2025-11-25T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.806684 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.826433 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.841287 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.855763 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.873941 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.895386 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.898241 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.898294 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.898308 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.898329 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.898343 4704 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:45Z","lastTransitionTime":"2025-11-25T15:35:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.912822 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.929719 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.948342 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"con
tainerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.963357 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.977471 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:45 crc kubenswrapper[4704]: I1125 15:35:45.991073 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:45Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.001395 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.001459 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.001473 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 
15:35:46.001491 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.001504 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:46Z","lastTransitionTime":"2025-11-25T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.012972 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.025100 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.064567 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.078639 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.090738 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.102327 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.103918 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.103978 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.103989 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.104013 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.104034 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:46Z","lastTransitionTime":"2025-11-25T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.115497 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:46Z 
is after 2025-08-24T17:21:41Z" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.139031 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11
-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.158737 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.176671 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.203772 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8115eae7e33c40ba886d5faf6e669b73b04e8004dae9c19fdd6ce096ee690e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:46Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.206338 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.206396 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.206410 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.206429 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.206442 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:46Z","lastTransitionTime":"2025-11-25T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.309655 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.309701 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.309713 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.309732 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.309745 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:46Z","lastTransitionTime":"2025-11-25T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.412879 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.412936 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.412951 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.412969 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.412986 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:46Z","lastTransitionTime":"2025-11-25T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.415632 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.415798 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:46 crc kubenswrapper[4704]: E1125 15:35:46.415779 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:35:46 crc kubenswrapper[4704]: E1125 15:35:46.416079 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.515154 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.515620 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.515723 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.515808 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.515885 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:46Z","lastTransitionTime":"2025-11-25T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.621027 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.621100 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.621115 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.621137 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.621152 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:46Z","lastTransitionTime":"2025-11-25T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.670846 4704 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.724684 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.724742 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.724754 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.724781 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.724814 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:46Z","lastTransitionTime":"2025-11-25T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.826850 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.826885 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.826893 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.826908 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.826917 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:46Z","lastTransitionTime":"2025-11-25T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.934431 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.934491 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.934502 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.934524 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:46 crc kubenswrapper[4704]: I1125 15:35:46.934538 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:46Z","lastTransitionTime":"2025-11-25T15:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.037017 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.037065 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.037086 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.037102 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.037114 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:47Z","lastTransitionTime":"2025-11-25T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.140272 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.140317 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.140327 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.140348 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.140361 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:47Z","lastTransitionTime":"2025-11-25T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.243415 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.243470 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.243483 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.243501 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.243514 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:47Z","lastTransitionTime":"2025-11-25T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.346099 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.346156 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.346167 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.346187 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.346201 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:47Z","lastTransitionTime":"2025-11-25T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.416394 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:47 crc kubenswrapper[4704]: E1125 15:35:47.416543 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.449034 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.449076 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.449085 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.449104 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.449117 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:47Z","lastTransitionTime":"2025-11-25T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.552366 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.552415 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.552429 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.552447 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.552458 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:47Z","lastTransitionTime":"2025-11-25T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.655047 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.655109 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.655123 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.655148 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.655162 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:47Z","lastTransitionTime":"2025-11-25T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.673676 4704 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.758338 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.758820 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.758836 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.758854 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.758869 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:47Z","lastTransitionTime":"2025-11-25T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.861750 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.861822 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.861833 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.861852 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.861865 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:47Z","lastTransitionTime":"2025-11-25T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.964391 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.964429 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.964439 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.964454 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:47 crc kubenswrapper[4704]: I1125 15:35:47.964466 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:47Z","lastTransitionTime":"2025-11-25T15:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.066970 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.067030 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.067044 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.067066 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.067080 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:48Z","lastTransitionTime":"2025-11-25T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.169969 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.170020 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.170029 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.170048 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.170061 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:48Z","lastTransitionTime":"2025-11-25T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.272047 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.272092 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.272106 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.272127 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.272143 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:48Z","lastTransitionTime":"2025-11-25T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.375397 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.375455 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.375471 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.375491 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.375504 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:48Z","lastTransitionTime":"2025-11-25T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.415739 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.415897 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:48 crc kubenswrapper[4704]: E1125 15:35:48.415981 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:35:48 crc kubenswrapper[4704]: E1125 15:35:48.416087 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.477745 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.477816 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.477835 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.477856 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.477875 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:48Z","lastTransitionTime":"2025-11-25T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.580066 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.580124 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.580136 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.580155 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.580169 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:48Z","lastTransitionTime":"2025-11-25T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.682647 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.682684 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.682694 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.682713 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.682726 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:48Z","lastTransitionTime":"2025-11-25T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.785305 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.785343 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.785352 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.785369 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.785378 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:48Z","lastTransitionTime":"2025-11-25T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.887949 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.887993 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.888002 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.888017 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.888042 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:48Z","lastTransitionTime":"2025-11-25T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.990928 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.990970 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.990982 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.991000 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:48 crc kubenswrapper[4704]: I1125 15:35:48.991016 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:48Z","lastTransitionTime":"2025-11-25T15:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.093829 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.093886 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.093897 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.093914 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.093927 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:49Z","lastTransitionTime":"2025-11-25T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.197824 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.197874 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.197884 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.197909 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.197919 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:49Z","lastTransitionTime":"2025-11-25T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.303460 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.303519 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.303531 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.303549 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.303563 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:49Z","lastTransitionTime":"2025-11-25T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.405761 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.406135 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.406197 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.406284 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.406367 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:49Z","lastTransitionTime":"2025-11-25T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.415568 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:49 crc kubenswrapper[4704]: E1125 15:35:49.415765 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.510218 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.510267 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.510282 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.510355 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.510368 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:49Z","lastTransitionTime":"2025-11-25T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.612979 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.613302 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.613382 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.613456 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.613515 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:49Z","lastTransitionTime":"2025-11-25T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.680271 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovnkube-controller/0.log" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.682936 4704 generic.go:334] "Generic (PLEG): container finished" podID="f5274608-0c76-48d9-949d-53254df99b83" containerID="8115eae7e33c40ba886d5faf6e669b73b04e8004dae9c19fdd6ce096ee690e08" exitCode=1 Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.682976 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerDied","Data":"8115eae7e33c40ba886d5faf6e669b73b04e8004dae9c19fdd6ce096ee690e08"} Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.683679 4704 scope.go:117] "RemoveContainer" containerID="8115eae7e33c40ba886d5faf6e669b73b04e8004dae9c19fdd6ce096ee690e08" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.696253 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:49Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.711355 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:49Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.716219 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.716251 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.716261 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.716277 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.716287 4704 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:49Z","lastTransitionTime":"2025-11-25T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.723928 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:49Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.740898 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:49Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.761776 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-re
adyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mount
Path\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:49Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.774606 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:49Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.787021 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:49Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.806150 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8115eae7e33c40ba886d5faf6e669b73b04e8004dae9c19fdd6ce096ee690e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8115eae7e33c40ba886d5faf6e669b73b04e8004dae9c19fdd6ce096ee690e08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:35:48Z\\\",\\\"message\\\":\\\"eflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:35:48.693505 6021 reflector.go:311] 
Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:35:48.693715 6021 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:35:48.694006 6021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:35:48.694090 6021 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:35:48.694124 6021 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:35:48.694130 6021 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:35:48.694143 6021 factory.go:656] Stopping watch factory\\\\nI1125 15:35:48.694149 6021 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:35:48.694158 6021 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:35:48.694163 6021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:35:48.694173 6021 handler.go:208] Removed *v1.Node event handler 
2\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561
f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:49Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.818487 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.818537 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.818547 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.818511 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:49Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.818565 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.818724 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:49Z","lastTransitionTime":"2025-11-25T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.831644 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:49Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.847379 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:49Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.863737 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:49Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.878466 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e
1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:49Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.889963 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:49Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.898169 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:49Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.921193 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.921389 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.921452 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.921526 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:49 crc kubenswrapper[4704]: I1125 15:35:49.921582 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:49Z","lastTransitionTime":"2025-11-25T15:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.025257 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.025306 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.025318 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.025336 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.025348 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:50Z","lastTransitionTime":"2025-11-25T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.127619 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.127665 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.127678 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.127693 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.127702 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:50Z","lastTransitionTime":"2025-11-25T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.133206 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.133266 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.133288 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.133405 4704 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.133440 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:36:06.13340555 +0000 UTC m=+52.401679391 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.133488 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:36:06.133469632 +0000 UTC m=+52.401743453 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.133511 4704 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.133606 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:36:06.133585905 +0000 UTC m=+52.401859716 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.155378 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.155414 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.155422 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.155437 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.155448 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:50Z","lastTransitionTime":"2025-11-25T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.173067 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.178005 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.178042 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.178054 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.178072 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.178083 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:50Z","lastTransitionTime":"2025-11-25T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.190715 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.195677 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.195733 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.195751 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.195776 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.195834 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:50Z","lastTransitionTime":"2025-11-25T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.210004 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.214037 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.214080 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.214097 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.214118 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.214132 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:50Z","lastTransitionTime":"2025-11-25T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.226847 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.230712 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.230772 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.230828 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.230853 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.230878 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:50Z","lastTransitionTime":"2025-11-25T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.234212 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.234270 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.234400 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.234408 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.234422 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.234434 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.234440 4704 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.234447 4704 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.234498 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:36:06.234483803 +0000 UTC m=+52.502757584 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.234515 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:36:06.234509043 +0000 UTC m=+52.502782824 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.243340 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:50Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.243496 4704 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.245074 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.245115 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.245127 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.245143 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.245153 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:50Z","lastTransitionTime":"2025-11-25T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.348028 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.348305 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.348419 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.348485 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.348565 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:50Z","lastTransitionTime":"2025-11-25T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.417238 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.417445 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.417739 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:50 crc kubenswrapper[4704]: E1125 15:35:50.418003 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.451363 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.451767 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.451896 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.451979 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.452053 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:50Z","lastTransitionTime":"2025-11-25T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.512661 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.554559 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.554606 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.554616 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.554633 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.554644 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:50Z","lastTransitionTime":"2025-11-25T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.656486 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.656522 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.656531 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.656545 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.656554 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:50Z","lastTransitionTime":"2025-11-25T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.689634 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovnkube-controller/0.log" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.693837 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerStarted","Data":"d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70"} Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.758319 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.758358 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.758370 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.758388 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.758428 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:50Z","lastTransitionTime":"2025-11-25T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.860654 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.860699 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.860711 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.860727 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.860736 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:50Z","lastTransitionTime":"2025-11-25T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.963102 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.963137 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.963144 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.963161 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:50 crc kubenswrapper[4704]: I1125 15:35:50.963170 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:50Z","lastTransitionTime":"2025-11-25T15:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.060267 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97"] Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.060758 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.063114 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.065522 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.072138 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.072180 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.072192 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.072209 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.072223 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:51Z","lastTransitionTime":"2025-11-25T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.084215 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8115eae7e33c40ba886d5faf6e669b73b04e8004dae9c19fdd6ce096ee690e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8115eae7e33c40ba886d5faf6e669b73b04e8004dae9c19fdd6ce096ee690e08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:35:48Z\\\",\\\"message\\\":\\\"eflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:35:48.693505 6021 reflector.go:311] 
Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:35:48.693715 6021 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:35:48.694006 6021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:35:48.694090 6021 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:35:48.694124 6021 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:35:48.694130 6021 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:35:48.694143 6021 factory.go:656] Stopping watch factory\\\\nI1125 15:35:48.694149 6021 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:35:48.694158 6021 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:35:48.694163 6021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:35:48.694173 6021 handler.go:208] Removed *v1.Node event handler 
2\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561
f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.103111 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.114691 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.125833 4704 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.138187 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.145081 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8801589f-7db3-4c55-9232-29b5417286d2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kct97\" (UID: \"8801589f-7db3-4c55-9232-29b5417286d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.145177 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8801589f-7db3-4c55-9232-29b5417286d2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kct97\" (UID: \"8801589f-7db3-4c55-9232-29b5417286d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.145215 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j29nl\" (UniqueName: \"kubernetes.io/projected/8801589f-7db3-4c55-9232-29b5417286d2-kube-api-access-j29nl\") pod \"ovnkube-control-plane-749d76644c-kct97\" (UID: \"8801589f-7db3-4c55-9232-29b5417286d2\") 
" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.145239 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8801589f-7db3-4c55-9232-29b5417286d2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kct97\" (UID: \"8801589f-7db3-4c55-9232-29b5417286d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.151849 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.162685 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.174408 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.174456 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.174467 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:51 crc 
kubenswrapper[4704]: I1125 15:35:51.174486 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.174498 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:51Z","lastTransitionTime":"2025-11-25T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.176902 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49
0002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.188751 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.198756 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.207299 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.218826 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.228811 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.240882 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.245898 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8801589f-7db3-4c55-9232-29b5417286d2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kct97\" (UID: \"8801589f-7db3-4c55-9232-29b5417286d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.246269 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j29nl\" (UniqueName: \"kubernetes.io/projected/8801589f-7db3-4c55-9232-29b5417286d2-kube-api-access-j29nl\") pod \"ovnkube-control-plane-749d76644c-kct97\" (UID: \"8801589f-7db3-4c55-9232-29b5417286d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.246531 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8801589f-7db3-4c55-9232-29b5417286d2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kct97\" (UID: \"8801589f-7db3-4c55-9232-29b5417286d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.246956 4704 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8801589f-7db3-4c55-9232-29b5417286d2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kct97\" (UID: \"8801589f-7db3-4c55-9232-29b5417286d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.247605 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8801589f-7db3-4c55-9232-29b5417286d2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kct97\" (UID: \"8801589f-7db3-4c55-9232-29b5417286d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.247546 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8801589f-7db3-4c55-9232-29b5417286d2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kct97\" (UID: \"8801589f-7db3-4c55-9232-29b5417286d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.254285 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.255124 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/8801589f-7db3-4c55-9232-29b5417286d2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kct97\" (UID: \"8801589f-7db3-4c55-9232-29b5417286d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.261159 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j29nl\" (UniqueName: \"kubernetes.io/projected/8801589f-7db3-4c55-9232-29b5417286d2-kube-api-access-j29nl\") pod \"ovnkube-control-plane-749d76644c-kct97\" (UID: \"8801589f-7db3-4c55-9232-29b5417286d2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.268257 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:51Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.277539 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.277844 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.277977 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.278072 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.278169 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:51Z","lastTransitionTime":"2025-11-25T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.381033 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.381077 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.381088 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.381104 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.381115 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:51Z","lastTransitionTime":"2025-11-25T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.384480 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" Nov 25 15:35:51 crc kubenswrapper[4704]: W1125 15:35:51.399535 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8801589f_7db3_4c55_9232_29b5417286d2.slice/crio-b2da1af96c8438eaf65dc4b7e0f7703eec384e184a06acf8e88dcb6fb16bdba1 WatchSource:0}: Error finding container b2da1af96c8438eaf65dc4b7e0f7703eec384e184a06acf8e88dcb6fb16bdba1: Status 404 returned error can't find the container with id b2da1af96c8438eaf65dc4b7e0f7703eec384e184a06acf8e88dcb6fb16bdba1 Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.415552 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:51 crc kubenswrapper[4704]: E1125 15:35:51.415845 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.483965 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.484019 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.484029 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.484045 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.484058 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:51Z","lastTransitionTime":"2025-11-25T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.586249 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.586288 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.586299 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.586316 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.586330 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:51Z","lastTransitionTime":"2025-11-25T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.689114 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.689170 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.689183 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.689206 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.689395 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:51Z","lastTransitionTime":"2025-11-25T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.698968 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" event={"ID":"8801589f-7db3-4c55-9232-29b5417286d2","Type":"ContainerStarted","Data":"b2da1af96c8438eaf65dc4b7e0f7703eec384e184a06acf8e88dcb6fb16bdba1"} Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.792275 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.792323 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.792334 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.792352 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.792364 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:51Z","lastTransitionTime":"2025-11-25T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.894890 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.894949 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.894964 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.894989 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.895008 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:51Z","lastTransitionTime":"2025-11-25T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.996643 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.996684 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.996695 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.996709 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:51 crc kubenswrapper[4704]: I1125 15:35:51.996719 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:51Z","lastTransitionTime":"2025-11-25T15:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.099972 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.100021 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.100035 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.100053 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.100065 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:52Z","lastTransitionTime":"2025-11-25T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.185050 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-z6lnx"] Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.186096 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:35:52 crc kubenswrapper[4704]: E1125 15:35:52.186188 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.203133 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.203189 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.203201 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.203221 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.203232 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:52Z","lastTransitionTime":"2025-11-25T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.211765 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.246839 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.260842 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs\") pod \"network-metrics-daemon-z6lnx\" (UID: \"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\") " pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.260929 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbgz5\" (UniqueName: \"kubernetes.io/projected/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-kube-api-access-wbgz5\") pod \"network-metrics-daemon-z6lnx\" (UID: \"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\") " pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.286839 4704 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.303475 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.305625 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.305667 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.305678 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.305696 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.305708 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:52Z","lastTransitionTime":"2025-11-25T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.321848 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.334455 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc 
kubenswrapper[4704]: I1125 15:35:52.353394 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.361998 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbgz5\" (UniqueName: \"kubernetes.io/projected/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-kube-api-access-wbgz5\") pod \"network-metrics-daemon-z6lnx\" (UID: \"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\") " pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.362061 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs\") pod \"network-metrics-daemon-z6lnx\" (UID: \"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\") " pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:35:52 crc kubenswrapper[4704]: E1125 15:35:52.362215 4704 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:35:52 crc kubenswrapper[4704]: E1125 15:35:52.362274 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs 
podName:b9cf8fad-2f72-4a94-958b-dd58fc76f4df nodeName:}" failed. No retries permitted until 2025-11-25 15:35:52.862253377 +0000 UTC m=+39.130527158 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs") pod "network-metrics-daemon-z6lnx" (UID: "b9cf8fad-2f72-4a94-958b-dd58fc76f4df") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.366865 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.379490 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.385586 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbgz5\" (UniqueName: \"kubernetes.io/projected/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-kube-api-access-wbgz5\") pod \"network-metrics-daemon-z6lnx\" (UID: \"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\") " pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.399588 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8115eae7e33c40ba886d5faf6e669b73b04e8004dae9c19fdd6ce096ee690e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8115eae7e33c40ba886d5faf6e669b73b04e8004dae9c19fdd6ce096ee690e08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:35:48Z\\\",\\\"message\\\":\\\"eflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:35:48.693505 6021 reflector.go:311] 
Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:35:48.693715 6021 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:35:48.694006 6021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:35:48.694090 6021 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:35:48.694124 6021 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:35:48.694130 6021 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:35:48.694143 6021 factory.go:656] Stopping watch factory\\\\nI1125 15:35:48.694149 6021 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:35:48.694158 6021 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:35:48.694163 6021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:35:48.694173 6021 handler.go:208] Removed *v1.Node event handler 
2\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561
f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.408098 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.408147 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.408162 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.408185 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.408198 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:52Z","lastTransitionTime":"2025-11-25T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.416461 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:52 crc kubenswrapper[4704]: E1125 15:35:52.416630 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.417156 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:52 crc kubenswrapper[4704]: E1125 15:35:52.417254 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.418368 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.432266 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.446107 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.459727 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.478337 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e
1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.489006 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.500387 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.511533 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.511586 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.511597 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.511616 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.511629 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:52Z","lastTransitionTime":"2025-11-25T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.614298 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.614335 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.614345 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.614364 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.614380 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:52Z","lastTransitionTime":"2025-11-25T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.703668 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovnkube-controller/1.log" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.704857 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovnkube-controller/0.log" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.708076 4704 generic.go:334] "Generic (PLEG): container finished" podID="f5274608-0c76-48d9-949d-53254df99b83" containerID="d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70" exitCode=1 Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.708157 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerDied","Data":"d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70"} Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.708402 4704 scope.go:117] "RemoveContainer" containerID="8115eae7e33c40ba886d5faf6e669b73b04e8004dae9c19fdd6ce096ee690e08" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.708871 4704 scope.go:117] "RemoveContainer" containerID="d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70" Nov 25 15:35:52 crc kubenswrapper[4704]: E1125 15:35:52.709224 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.710391 4704 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" event={"ID":"8801589f-7db3-4c55-9232-29b5417286d2","Type":"ContainerStarted","Data":"313f477ea11a405cea82c897651b0950eb648d63a44b5292a774c5d943f21483"} Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.710426 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" event={"ID":"8801589f-7db3-4c55-9232-29b5417286d2","Type":"ContainerStarted","Data":"b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021"} Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.716742 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.716893 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.716909 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.716931 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.716947 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:52Z","lastTransitionTime":"2025-11-25T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.734666 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.750531 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.765499 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.791379 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8115eae7e33c40ba886d5faf6e669b73b04e8004dae9c19fdd6ce096ee690e08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:35:48Z\\\",\\\"message\\\":\\\"eflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:35:48.693505 6021 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:35:48.693715 6021 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:35:48.694006 6021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:35:48.694090 6021 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:35:48.694124 6021 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:35:48.694130 6021 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:35:48.694143 6021 factory.go:656] Stopping watch factory\\\\nI1125 15:35:48.694149 6021 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:35:48.694158 6021 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:35:48.694163 6021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:35:48.694173 6021 handler.go:208] Removed *v1.Node event handler 2\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"er, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:35:52.084017 6162 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 15:35:52.084511 6162 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/
cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\
",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc 
kubenswrapper[4704]: I1125 15:35:52.809812 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658ab
c17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.819924 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.819977 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.819989 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.820007 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.820018 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:52Z","lastTransitionTime":"2025-11-25T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.824895 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff6
3fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.840152 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.855847 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.867336 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs\") pod \"network-metrics-daemon-z6lnx\" (UID: \"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\") " pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:35:52 crc kubenswrapper[4704]: E1125 15:35:52.867736 4704 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:35:52 crc kubenswrapper[4704]: E1125 15:35:52.867833 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs podName:b9cf8fad-2f72-4a94-958b-dd58fc76f4df nodeName:}" failed. No retries permitted until 2025-11-25 15:35:53.867814564 +0000 UTC m=+40.136088355 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs") pod "network-metrics-daemon-z6lnx" (UID: "b9cf8fad-2f72-4a94-958b-dd58fc76f4df") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.868394 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042
e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\
" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.881805 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.894682 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.911702 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.921919 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.921987 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.922002 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.922025 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.922042 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:52Z","lastTransitionTime":"2025-11-25T15:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.927469 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.941997 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.958490 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.971662 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:52 crc kubenswrapper[4704]: I1125 15:35:52.984518 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc 
kubenswrapper[4704]: I1125 15:35:53.006047 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.020708 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.024577 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.024612 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.024624 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.024645 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.024656 4704 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:53Z","lastTransitionTime":"2025-11-25T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.034586 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.054730 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8115eae7e33c40ba886d5faf6e669b73b04e8004dae9c19fdd6ce096ee690e08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:35:48Z\\\",\\\"message\\\":\\\"eflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:35:48.693505 6021 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:35:48.693715 6021 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1125 15:35:48.694006 6021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 15:35:48.694090 6021 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 15:35:48.694124 6021 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 15:35:48.694130 6021 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 15:35:48.694143 6021 factory.go:656] Stopping watch factory\\\\nI1125 15:35:48.694149 6021 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1125 15:35:48.694158 6021 ovnkube.go:599] Stopped ovnkube\\\\nI1125 15:35:48.694163 6021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1125 15:35:48.694173 6021 handler.go:208] Removed *v1.Node event handler 2\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"er, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to 
start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:35:52.084017 6162 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 15:35:52.084511 6162 obj_retry.go:303] Retry 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.069904 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.087263 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.100826 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.114636 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.127504 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.127543 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.127553 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:53 crc 
kubenswrapper[4704]: I1125 15:35:53.127570 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.127579 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:53Z","lastTransitionTime":"2025-11-25T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.130261 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49
0002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.142709 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.155433 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.171720 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.183902 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.197811 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.211296 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.225458 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://313f477ea11a405cea82c897651b0950eb648d63a44b5292a774c5d943f21483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.230322 4704 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.230367 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.230381 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.230398 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.230411 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:53Z","lastTransitionTime":"2025-11-25T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.238438 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc 
kubenswrapper[4704]: I1125 15:35:53.333230 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.333279 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.333288 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.333304 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.333314 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:53Z","lastTransitionTime":"2025-11-25T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.416014 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.416081 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:35:53 crc kubenswrapper[4704]: E1125 15:35:53.416196 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:35:53 crc kubenswrapper[4704]: E1125 15:35:53.416287 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.435645 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.436016 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.436166 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.436253 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.436332 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:53Z","lastTransitionTime":"2025-11-25T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.539914 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.539966 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.539978 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.540000 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.540019 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:53Z","lastTransitionTime":"2025-11-25T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.643440 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.643494 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.643515 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.643533 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.643548 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:53Z","lastTransitionTime":"2025-11-25T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.716219 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovnkube-controller/1.log" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.720369 4704 scope.go:117] "RemoveContainer" containerID="d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70" Nov 25 15:35:53 crc kubenswrapper[4704]: E1125 15:35:53.720541 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.745572 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.747719 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.747780 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.747824 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.747850 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.747899 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:53Z","lastTransitionTime":"2025-11-25T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.762670 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.778256 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.801631 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"er, err: could not add Event Handler for anpInformer during admin network policy 
controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:35:52.084017 6162 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 15:35:52.084511 6162 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b
481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.821345 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e
1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.836457 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.850291 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.850338 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.850348 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.850365 
4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.850378 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:53Z","lastTransitionTime":"2025-11-25T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.853362 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c9146
5dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.867016 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.878515 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.888767 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.900897 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.911120 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs\") pod \"network-metrics-daemon-z6lnx\" (UID: \"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\") " pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:35:53 crc kubenswrapper[4704]: E1125 15:35:53.911261 4704 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:35:53 crc kubenswrapper[4704]: E1125 15:35:53.911343 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs 
podName:b9cf8fad-2f72-4a94-958b-dd58fc76f4df nodeName:}" failed. No retries permitted until 2025-11-25 15:35:55.911326899 +0000 UTC m=+42.179600680 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs") pod "network-metrics-daemon-z6lnx" (UID: "b9cf8fad-2f72-4a94-958b-dd58fc76f4df") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.916225 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.933320 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.948556 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.952627 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.952678 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.952690 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.952709 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.952723 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:53Z","lastTransitionTime":"2025-11-25T15:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.966345 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z 
is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.979447 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://313f477ea11a405cea82c897651b0950eb648d63a44b5292a774c5d943f21483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:53 crc kubenswrapper[4704]: I1125 15:35:53.992096 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:53Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc 
kubenswrapper[4704]: I1125 15:35:54.055710 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.055763 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.055774 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.055805 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.055817 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:54Z","lastTransitionTime":"2025-11-25T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.158360 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.158406 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.158418 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.158435 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.158448 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:54Z","lastTransitionTime":"2025-11-25T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.262074 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.262127 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.262139 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.262161 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.262175 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:54Z","lastTransitionTime":"2025-11-25T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.365310 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.365361 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.365370 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.365392 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.365402 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:54Z","lastTransitionTime":"2025-11-25T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.416282 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.416308 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:54 crc kubenswrapper[4704]: E1125 15:35:54.416509 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:35:54 crc kubenswrapper[4704]: E1125 15:35:54.416436 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.437387 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646f
b68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272
e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.450672 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.462608 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.471806 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.471847 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.471859 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.471877 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.471888 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:54Z","lastTransitionTime":"2025-11-25T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.485667 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"er, err: could not add Event Handler for anpInformer during admin network policy 
controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:35:52.084017 6162 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 15:35:52.084511 6162 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b
481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.497263 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.509641 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.525728 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e
1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.539687 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.555930 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.565748 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.574331 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.574379 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.574394 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.574411 4704 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.574424 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:54Z","lastTransitionTime":"2025-11-25T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.578466 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.591367 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://313f477ea11a405cea82c897651b0950eb648
d63a44b5292a774c5d943f21483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.603199 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc 
kubenswrapper[4704]: I1125 15:35:54.617659 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.630387 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.643145 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.656932 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:35:54Z is after 2025-08-24T17:21:41Z" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.677365 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.677410 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.677421 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.677440 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.677453 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:54Z","lastTransitionTime":"2025-11-25T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.780247 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.780296 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.780306 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.780323 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.780335 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:54Z","lastTransitionTime":"2025-11-25T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.883154 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.883205 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.883221 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.883241 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.883253 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:54Z","lastTransitionTime":"2025-11-25T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.986139 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.986187 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.986198 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.986216 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:54 crc kubenswrapper[4704]: I1125 15:35:54.986227 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:54Z","lastTransitionTime":"2025-11-25T15:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.088689 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.088749 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.088759 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.088776 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.088808 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:55Z","lastTransitionTime":"2025-11-25T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.192181 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.192226 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.192239 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.192257 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.192271 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:55Z","lastTransitionTime":"2025-11-25T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.295411 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.295467 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.295475 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.295490 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.295500 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:55Z","lastTransitionTime":"2025-11-25T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.397392 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.397442 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.397454 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.397469 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.397481 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:55Z","lastTransitionTime":"2025-11-25T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.415853 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.415872 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:35:55 crc kubenswrapper[4704]: E1125 15:35:55.416011 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:35:55 crc kubenswrapper[4704]: E1125 15:35:55.416097 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.500221 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.500265 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.500274 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.500294 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.500308 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:55Z","lastTransitionTime":"2025-11-25T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.602403 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.602457 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.602470 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.602490 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.602504 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:55Z","lastTransitionTime":"2025-11-25T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.705450 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.705505 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.705514 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.705532 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.705543 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:55Z","lastTransitionTime":"2025-11-25T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.812177 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.812233 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.812250 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.812272 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.812289 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:55Z","lastTransitionTime":"2025-11-25T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.916522 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.916627 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.916650 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.916682 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.916704 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:55Z","lastTransitionTime":"2025-11-25T15:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:55 crc kubenswrapper[4704]: I1125 15:35:55.933177 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs\") pod \"network-metrics-daemon-z6lnx\" (UID: \"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\") " pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:35:55 crc kubenswrapper[4704]: E1125 15:35:55.933366 4704 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:35:55 crc kubenswrapper[4704]: E1125 15:35:55.933428 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs podName:b9cf8fad-2f72-4a94-958b-dd58fc76f4df nodeName:}" failed. No retries permitted until 2025-11-25 15:35:59.933410925 +0000 UTC m=+46.201684706 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs") pod "network-metrics-daemon-z6lnx" (UID: "b9cf8fad-2f72-4a94-958b-dd58fc76f4df") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.019140 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.019201 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.019210 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.019228 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.019240 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:56Z","lastTransitionTime":"2025-11-25T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.122760 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.122840 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.122859 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.122887 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.122906 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:56Z","lastTransitionTime":"2025-11-25T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.225418 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.225466 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.225478 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.225496 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.225510 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:56Z","lastTransitionTime":"2025-11-25T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.328955 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.329006 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.329017 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.329039 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.329052 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:56Z","lastTransitionTime":"2025-11-25T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.415746 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.415844 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:56 crc kubenswrapper[4704]: E1125 15:35:56.415956 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:35:56 crc kubenswrapper[4704]: E1125 15:35:56.416030 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.431825 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.431877 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.431889 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.431910 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.431923 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:56Z","lastTransitionTime":"2025-11-25T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.535566 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.535609 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.535620 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.535708 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.535724 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:56Z","lastTransitionTime":"2025-11-25T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.638938 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.638993 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.639005 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.639027 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.639040 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:56Z","lastTransitionTime":"2025-11-25T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.741841 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.741892 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.741903 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.741922 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.741935 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:56Z","lastTransitionTime":"2025-11-25T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.844322 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.844375 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.844384 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.844404 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.844416 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:56Z","lastTransitionTime":"2025-11-25T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.946687 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.946749 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.946769 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.946842 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:56 crc kubenswrapper[4704]: I1125 15:35:56.946869 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:56Z","lastTransitionTime":"2025-11-25T15:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.050100 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.050189 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.050208 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.050233 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.050246 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:57Z","lastTransitionTime":"2025-11-25T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.152993 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.153056 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.153072 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.153094 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.153108 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:57Z","lastTransitionTime":"2025-11-25T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.255852 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.255901 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.255919 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.255949 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.255973 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:57Z","lastTransitionTime":"2025-11-25T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.359560 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.359628 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.359637 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.359656 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.359667 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:57Z","lastTransitionTime":"2025-11-25T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.416362 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.416420 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:57 crc kubenswrapper[4704]: E1125 15:35:57.416691 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:35:57 crc kubenswrapper[4704]: E1125 15:35:57.417134 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.462556 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.462621 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.462633 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.462677 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.462691 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:57Z","lastTransitionTime":"2025-11-25T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.565421 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.565474 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.565486 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.565521 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.565539 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:57Z","lastTransitionTime":"2025-11-25T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.668453 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.668502 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.668513 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.668531 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.668545 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:57Z","lastTransitionTime":"2025-11-25T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.771272 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.771338 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.771348 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.771364 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.771374 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:57Z","lastTransitionTime":"2025-11-25T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.874037 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.874076 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.874084 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.874102 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.874115 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:57Z","lastTransitionTime":"2025-11-25T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.976581 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.976671 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.976686 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.976703 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:57 crc kubenswrapper[4704]: I1125 15:35:57.976713 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:57Z","lastTransitionTime":"2025-11-25T15:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.079163 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.079226 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.079238 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.079256 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.079267 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:58Z","lastTransitionTime":"2025-11-25T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.182293 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.182340 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.182350 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.182366 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.182377 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:58Z","lastTransitionTime":"2025-11-25T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.284360 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.284428 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.284439 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.284461 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.284473 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:58Z","lastTransitionTime":"2025-11-25T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.390477 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.390582 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.390685 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.390721 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.390745 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:58Z","lastTransitionTime":"2025-11-25T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.416459 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.416537 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:35:58 crc kubenswrapper[4704]: E1125 15:35:58.416656 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:35:58 crc kubenswrapper[4704]: E1125 15:35:58.416802 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.493296 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.493343 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.493354 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.493371 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.493381 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:58Z","lastTransitionTime":"2025-11-25T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.596691 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.596738 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.596746 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.596764 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.596775 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:58Z","lastTransitionTime":"2025-11-25T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.699583 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.699646 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.699656 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.699676 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.699687 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:58Z","lastTransitionTime":"2025-11-25T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.802806 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.802852 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.802866 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.802884 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.802900 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:58Z","lastTransitionTime":"2025-11-25T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.906528 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.906591 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.906602 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.906621 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:58 crc kubenswrapper[4704]: I1125 15:35:58.906667 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:58Z","lastTransitionTime":"2025-11-25T15:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.009166 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.009215 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.009225 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.009246 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.009257 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:59Z","lastTransitionTime":"2025-11-25T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.111600 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.111662 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.111673 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.111695 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.111707 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:59Z","lastTransitionTime":"2025-11-25T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.214739 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.214800 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.214816 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.214836 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.214847 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:59Z","lastTransitionTime":"2025-11-25T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.317513 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.317578 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.317590 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.317606 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.317617 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:59Z","lastTransitionTime":"2025-11-25T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.415521 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.415641 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:35:59 crc kubenswrapper[4704]: E1125 15:35:59.415720 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:35:59 crc kubenswrapper[4704]: E1125 15:35:59.415831 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.421089 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.421137 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.421148 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.421170 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.421192 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:59Z","lastTransitionTime":"2025-11-25T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.524326 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.524451 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.524466 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.524485 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.524495 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:59Z","lastTransitionTime":"2025-11-25T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.627234 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.627293 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.627308 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.627330 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.627343 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:59Z","lastTransitionTime":"2025-11-25T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.729846 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.729886 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.729895 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.729910 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.729919 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:59Z","lastTransitionTime":"2025-11-25T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.832520 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.832580 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.832595 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.832614 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.832628 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:59Z","lastTransitionTime":"2025-11-25T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.935777 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.935841 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.935852 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.935870 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.935882 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:35:59Z","lastTransitionTime":"2025-11-25T15:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:35:59 crc kubenswrapper[4704]: I1125 15:35:59.976439 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs\") pod \"network-metrics-daemon-z6lnx\" (UID: \"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\") " pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:35:59 crc kubenswrapper[4704]: E1125 15:35:59.976587 4704 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:35:59 crc kubenswrapper[4704]: E1125 15:35:59.976657 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs podName:b9cf8fad-2f72-4a94-958b-dd58fc76f4df nodeName:}" failed. No retries permitted until 2025-11-25 15:36:07.9766349 +0000 UTC m=+54.244908681 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs") pod "network-metrics-daemon-z6lnx" (UID: "b9cf8fad-2f72-4a94-958b-dd58fc76f4df") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.039090 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.039145 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.039156 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.039174 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.039184 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:00Z","lastTransitionTime":"2025-11-25T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.141982 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.142030 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.142040 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.142066 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.142080 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:00Z","lastTransitionTime":"2025-11-25T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.245398 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.245439 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.245449 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.245464 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.245475 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:00Z","lastTransitionTime":"2025-11-25T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.267536 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.267594 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.267609 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.267631 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.267643 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:00Z","lastTransitionTime":"2025-11-25T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:00 crc kubenswrapper[4704]: E1125 15:36:00.280848 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:00Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.285098 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.285157 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.285179 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.285204 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.285220 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:00Z","lastTransitionTime":"2025-11-25T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:00 crc kubenswrapper[4704]: E1125 15:36:00.302145 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:00Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.307107 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.307152 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.307164 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.307182 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.307196 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:00Z","lastTransitionTime":"2025-11-25T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:00 crc kubenswrapper[4704]: E1125 15:36:00.363157 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:00Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:00 crc kubenswrapper[4704]: E1125 15:36:00.363381 4704 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.365270 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.365301 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.365313 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.365331 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.365352 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:00Z","lastTransitionTime":"2025-11-25T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.416133 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:00 crc kubenswrapper[4704]: E1125 15:36:00.416284 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.416284 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:00 crc kubenswrapper[4704]: E1125 15:36:00.416375 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.467783 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.467873 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.467885 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.467902 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.467913 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:00Z","lastTransitionTime":"2025-11-25T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.571258 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.571323 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.571342 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.571361 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.571377 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:00Z","lastTransitionTime":"2025-11-25T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.674716 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.674771 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.674801 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.674824 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.674839 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:00Z","lastTransitionTime":"2025-11-25T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.777089 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.777133 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.777142 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.777162 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.777175 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:00Z","lastTransitionTime":"2025-11-25T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.879672 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.879714 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.879724 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.879740 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.879750 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:00Z","lastTransitionTime":"2025-11-25T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.982601 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.982677 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.982712 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.982733 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:00 crc kubenswrapper[4704]: I1125 15:36:00.982746 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:00Z","lastTransitionTime":"2025-11-25T15:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.085451 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.085516 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.085528 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.085542 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.085552 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:01Z","lastTransitionTime":"2025-11-25T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.188116 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.188157 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.188168 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.188186 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.188197 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:01Z","lastTransitionTime":"2025-11-25T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.290411 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.290442 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.290450 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.290466 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.290476 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:01Z","lastTransitionTime":"2025-11-25T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.392437 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.392783 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.392880 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.392951 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.393018 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:01Z","lastTransitionTime":"2025-11-25T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.416080 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:01 crc kubenswrapper[4704]: E1125 15:36:01.416251 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.416502 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:01 crc kubenswrapper[4704]: E1125 15:36:01.416827 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.496086 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.496136 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.496149 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.496164 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.496176 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:01Z","lastTransitionTime":"2025-11-25T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.598680 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.598725 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.598736 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.598751 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.598762 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:01Z","lastTransitionTime":"2025-11-25T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.702230 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.702441 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.702466 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.702495 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.702515 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:01Z","lastTransitionTime":"2025-11-25T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.805766 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.806183 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.806272 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.806357 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.806419 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:01Z","lastTransitionTime":"2025-11-25T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.910252 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.910295 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.910305 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.910322 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:01 crc kubenswrapper[4704]: I1125 15:36:01.910335 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:01Z","lastTransitionTime":"2025-11-25T15:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.013365 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.013432 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.013446 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.013469 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.013483 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:02Z","lastTransitionTime":"2025-11-25T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.116986 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.117331 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.117435 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.117503 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.117559 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:02Z","lastTransitionTime":"2025-11-25T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.220867 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.221898 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.221936 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.221962 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.221980 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:02Z","lastTransitionTime":"2025-11-25T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.324943 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.324994 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.325010 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.325035 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.325053 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:02Z","lastTransitionTime":"2025-11-25T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.416417 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:02 crc kubenswrapper[4704]: E1125 15:36:02.416632 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.416721 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:02 crc kubenswrapper[4704]: E1125 15:36:02.416977 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.430819 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.431120 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.431192 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.431259 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.431323 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:02Z","lastTransitionTime":"2025-11-25T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.535825 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.536148 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.536269 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.536375 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.536454 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:02Z","lastTransitionTime":"2025-11-25T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.640252 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.640293 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.640304 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.640324 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.640336 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:02Z","lastTransitionTime":"2025-11-25T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.743657 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.743749 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.743764 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.743784 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.743846 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:02Z","lastTransitionTime":"2025-11-25T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.847286 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.847328 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.847339 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.847357 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.847367 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:02Z","lastTransitionTime":"2025-11-25T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.950896 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.950949 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.950960 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.950983 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:02 crc kubenswrapper[4704]: I1125 15:36:02.951000 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:02Z","lastTransitionTime":"2025-11-25T15:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.054526 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.054583 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.054594 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.054611 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.054622 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:03Z","lastTransitionTime":"2025-11-25T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.158520 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.158574 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.158584 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.158600 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.158611 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:03Z","lastTransitionTime":"2025-11-25T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.262042 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.262082 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.262094 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.262115 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.262132 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:03Z","lastTransitionTime":"2025-11-25T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.365281 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.365341 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.365355 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.365377 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.365391 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:03Z","lastTransitionTime":"2025-11-25T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.416321 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.416422 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:03 crc kubenswrapper[4704]: E1125 15:36:03.416482 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:03 crc kubenswrapper[4704]: E1125 15:36:03.416647 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.467833 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.467887 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.467901 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.467921 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.467935 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:03Z","lastTransitionTime":"2025-11-25T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.570214 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.570279 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.570298 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.570320 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.570332 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:03Z","lastTransitionTime":"2025-11-25T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.673399 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.673457 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.673479 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.673502 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.673517 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:03Z","lastTransitionTime":"2025-11-25T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.777058 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.777115 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.777125 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.777146 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.777158 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:03Z","lastTransitionTime":"2025-11-25T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.880924 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.881034 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.881062 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.881097 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.881127 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:03Z","lastTransitionTime":"2025-11-25T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.984439 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.984502 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.984517 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.984537 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:03 crc kubenswrapper[4704]: I1125 15:36:03.984548 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:03Z","lastTransitionTime":"2025-11-25T15:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.087194 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.087263 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.087283 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.087320 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.087339 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:04Z","lastTransitionTime":"2025-11-25T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.191188 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.191248 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.191261 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.191279 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.191290 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:04Z","lastTransitionTime":"2025-11-25T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.293878 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.293928 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.293940 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.293959 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.293977 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:04Z","lastTransitionTime":"2025-11-25T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.396892 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.396944 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.396957 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.396975 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.396985 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:04Z","lastTransitionTime":"2025-11-25T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.415476 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.415648 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:04 crc kubenswrapper[4704]: E1125 15:36:04.415834 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:04 crc kubenswrapper[4704]: E1125 15:36:04.416042 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.430859 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://313f477ea11a405cea82c897651b0950eb648d63a44b5292a774c5d943f21483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.443939 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc 
kubenswrapper[4704]: I1125 15:36:04.456289 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.468080 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.478846 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.490974 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.500059 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.500112 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.500123 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.500139 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.500386 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:04Z","lastTransitionTime":"2025-11-25T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.510565 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.531808 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.546154 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.567545 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"er, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:35:52.084017 6162 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 15:35:52.084511 6162 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b
481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.583938 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.597745 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.602654 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.602701 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.602711 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:04 crc 
kubenswrapper[4704]: I1125 15:36:04.602728 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.602739 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:04Z","lastTransitionTime":"2025-11-25T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.614719 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49
0002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.628900 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.645580 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.656501 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.668429 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:04Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.705592 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.705649 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.705667 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.705739 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.705757 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:04Z","lastTransitionTime":"2025-11-25T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.808907 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.809001 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.809014 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.809037 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.809050 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:04Z","lastTransitionTime":"2025-11-25T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.911723 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.911840 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.911856 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.911878 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:04 crc kubenswrapper[4704]: I1125 15:36:04.911891 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:04Z","lastTransitionTime":"2025-11-25T15:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.016027 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.016074 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.016083 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.016108 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.016119 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:05Z","lastTransitionTime":"2025-11-25T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.119440 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.119538 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.119561 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.119595 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.119614 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:05Z","lastTransitionTime":"2025-11-25T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.222590 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.222653 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.222672 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.222696 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.222715 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:05Z","lastTransitionTime":"2025-11-25T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.325899 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.325985 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.325998 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.326021 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.326038 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:05Z","lastTransitionTime":"2025-11-25T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.416300 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.416386 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:05 crc kubenswrapper[4704]: E1125 15:36:05.416465 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:05 crc kubenswrapper[4704]: E1125 15:36:05.416667 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.428825 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.428873 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.428905 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.428929 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.428946 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:05Z","lastTransitionTime":"2025-11-25T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.532730 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.532813 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.532826 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.532849 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.532861 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:05Z","lastTransitionTime":"2025-11-25T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.635457 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.635499 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.635512 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.635532 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.635545 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:05Z","lastTransitionTime":"2025-11-25T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.738493 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.738543 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.738556 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.738580 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.738596 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:05Z","lastTransitionTime":"2025-11-25T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.841140 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.841195 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.841208 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.841229 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.841248 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:05Z","lastTransitionTime":"2025-11-25T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.944812 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.944870 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.944884 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.944909 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:05 crc kubenswrapper[4704]: I1125 15:36:05.944923 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:05Z","lastTransitionTime":"2025-11-25T15:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.049523 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.049600 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.049626 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.049654 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.049673 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:06Z","lastTransitionTime":"2025-11-25T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.142882 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.143050 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:06 crc kubenswrapper[4704]: E1125 15:36:06.143177 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:36:38.143120807 +0000 UTC m=+84.411394628 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:36:06 crc kubenswrapper[4704]: E1125 15:36:06.143151 4704 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:36:06 crc kubenswrapper[4704]: E1125 15:36:06.143296 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:36:38.143269612 +0000 UTC m=+84.411543423 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.143332 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:06 crc kubenswrapper[4704]: E1125 15:36:06.143531 4704 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:36:06 crc kubenswrapper[4704]: E1125 15:36:06.143638 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:36:38.143622182 +0000 UTC m=+84.411896003 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.153002 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.153063 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.153088 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.153120 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.153142 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:06Z","lastTransitionTime":"2025-11-25T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.244396 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.244504 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:06 crc kubenswrapper[4704]: E1125 15:36:06.244699 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:36:06 crc kubenswrapper[4704]: E1125 15:36:06.244759 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:36:06 crc kubenswrapper[4704]: E1125 15:36:06.244782 4704 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:36:06 crc kubenswrapper[4704]: E1125 15:36:06.244700 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 
15:36:06 crc kubenswrapper[4704]: E1125 15:36:06.244978 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:36:38.244946571 +0000 UTC m=+84.513220392 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:36:06 crc kubenswrapper[4704]: E1125 15:36:06.244997 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:36:06 crc kubenswrapper[4704]: E1125 15:36:06.245026 4704 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:36:06 crc kubenswrapper[4704]: E1125 15:36:06.245123 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:36:38.245090115 +0000 UTC m=+84.513363936 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.255821 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.255859 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.255870 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.255891 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.255905 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:06Z","lastTransitionTime":"2025-11-25T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.359011 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.359063 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.359074 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.359094 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.359107 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:06Z","lastTransitionTime":"2025-11-25T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.415847 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.416074 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:06 crc kubenswrapper[4704]: E1125 15:36:06.416205 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:06 crc kubenswrapper[4704]: E1125 15:36:06.416483 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.462222 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.462277 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.462290 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.462307 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.462320 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:06Z","lastTransitionTime":"2025-11-25T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.565927 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.565972 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.565986 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.566004 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.566017 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:06Z","lastTransitionTime":"2025-11-25T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.669257 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.669299 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.669310 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.669329 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.669343 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:06Z","lastTransitionTime":"2025-11-25T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.772167 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.772213 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.772225 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.772243 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.772256 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:06Z","lastTransitionTime":"2025-11-25T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.875497 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.875538 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.875550 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.875568 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.875581 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:06Z","lastTransitionTime":"2025-11-25T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.979306 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.979361 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.979372 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.979390 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:06 crc kubenswrapper[4704]: I1125 15:36:06.979405 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:06Z","lastTransitionTime":"2025-11-25T15:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.082115 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.082168 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.082181 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.082202 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.082214 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:07Z","lastTransitionTime":"2025-11-25T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.185503 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.185543 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.185552 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.185566 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.185576 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:07Z","lastTransitionTime":"2025-11-25T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.288410 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.288774 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.288888 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.288979 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.289051 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:07Z","lastTransitionTime":"2025-11-25T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.391720 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.391774 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.391806 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.391825 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.391839 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:07Z","lastTransitionTime":"2025-11-25T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.415979 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.416086 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:07 crc kubenswrapper[4704]: E1125 15:36:07.416117 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:07 crc kubenswrapper[4704]: E1125 15:36:07.416266 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.495249 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.495310 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.495323 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.495349 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.495362 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:07Z","lastTransitionTime":"2025-11-25T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.598454 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.598502 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.598516 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.598536 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.598550 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:07Z","lastTransitionTime":"2025-11-25T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.700900 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.701308 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.701394 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.701519 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.701588 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:07Z","lastTransitionTime":"2025-11-25T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.803683 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.803728 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.803737 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.803752 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.803761 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:07Z","lastTransitionTime":"2025-11-25T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.906249 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.906302 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.906313 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.906335 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:07 crc kubenswrapper[4704]: I1125 15:36:07.906349 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:07Z","lastTransitionTime":"2025-11-25T15:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.009916 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.009980 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.009991 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.010008 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.010019 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:08Z","lastTransitionTime":"2025-11-25T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.067564 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs\") pod \"network-metrics-daemon-z6lnx\" (UID: \"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\") " pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:08 crc kubenswrapper[4704]: E1125 15:36:08.067830 4704 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:36:08 crc kubenswrapper[4704]: E1125 15:36:08.067973 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs podName:b9cf8fad-2f72-4a94-958b-dd58fc76f4df nodeName:}" failed. No retries permitted until 2025-11-25 15:36:24.067943382 +0000 UTC m=+70.336217163 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs") pod "network-metrics-daemon-z6lnx" (UID: "b9cf8fad-2f72-4a94-958b-dd58fc76f4df") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.113184 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.113246 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.113258 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.113281 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.113300 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:08Z","lastTransitionTime":"2025-11-25T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.216453 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.216494 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.216503 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.216519 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.216531 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:08Z","lastTransitionTime":"2025-11-25T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.320098 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.320145 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.320157 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.320174 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.320185 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:08Z","lastTransitionTime":"2025-11-25T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.416369 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.416401 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:08 crc kubenswrapper[4704]: E1125 15:36:08.416558 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:08 crc kubenswrapper[4704]: E1125 15:36:08.416734 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.417983 4704 scope.go:117] "RemoveContainer" containerID="d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.422867 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.422898 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.422915 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.422933 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.422948 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:08Z","lastTransitionTime":"2025-11-25T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.528238 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.528746 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.528758 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.528776 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.528802 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:08Z","lastTransitionTime":"2025-11-25T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.693817 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.693876 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.693890 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.693910 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.693933 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:08Z","lastTransitionTime":"2025-11-25T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.776100 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovnkube-controller/1.log" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.779450 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerStarted","Data":"66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f"} Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.780126 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.792338 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f
5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.796594 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.796641 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.796651 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 
15:36:08.796666 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.796677 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:08Z","lastTransitionTime":"2025-11-25T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.805574 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a
45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.819614 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.839755 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.856295 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.872858 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.886510 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://313f477ea11a405cea82c897651b0950eb648d63a44b5292a774c5d943f21483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.898754 4704 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.898807 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.898817 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.898833 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.898844 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:08Z","lastTransitionTime":"2025-11-25T15:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.902562 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:08 crc 
kubenswrapper[4704]: I1125 15:36:08.923000 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.937382 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.954304 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.980996 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"er, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:35:52.084017 6162 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 15:35:52.084511 6162 obj_retry.go:303] Retry 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:08 crc kubenswrapper[4704]: I1125 15:36:08.996536 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:08Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.001294 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.001352 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.001367 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.001386 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.001399 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:09Z","lastTransitionTime":"2025-11-25T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.012073 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountP
ath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube
-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\",\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.025601 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.037390 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.056758 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e
1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.104050 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.104103 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.104116 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.104138 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.104151 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:09Z","lastTransitionTime":"2025-11-25T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.206497 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.206533 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.206542 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.206556 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.206566 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:09Z","lastTransitionTime":"2025-11-25T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.310253 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.310289 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.310301 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.310324 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.310335 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:09Z","lastTransitionTime":"2025-11-25T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.348068 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.360659 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.371838 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.386057 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.400112 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.413450 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.413498 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.413513 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.413535 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.413548 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:09Z","lastTransitionTime":"2025-11-25T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.415317 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z 
is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.415360 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.415347 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:09 crc kubenswrapper[4704]: E1125 15:36:09.415477 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:09 crc kubenswrapper[4704]: E1125 15:36:09.415606 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.427388 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://313f477ea11a405cea82c897651b0950eb648d63a44b5292a774c5d943f21483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.439404 4704 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc 
kubenswrapper[4704]: I1125 15:36:09.462511 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.476183 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.490843 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.511149 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"er, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:35:52.084017 6162 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 15:35:52.084511 6162 obj_retry.go:303] Retry 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.515391 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.515446 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.515458 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.515481 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.515492 4704 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:09Z","lastTransitionTime":"2025-11-25T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.525304 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.541564 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.554580 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.565498 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.582090 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e
1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.594863 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.608857 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.618640 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.618686 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.618696 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.618713 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.618729 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:09Z","lastTransitionTime":"2025-11-25T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.721386 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.721430 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.721441 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.721462 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.721486 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:09Z","lastTransitionTime":"2025-11-25T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.784150 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovnkube-controller/2.log" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.784701 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovnkube-controller/1.log" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.787286 4704 generic.go:334] "Generic (PLEG): container finished" podID="f5274608-0c76-48d9-949d-53254df99b83" containerID="66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f" exitCode=1 Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.787359 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerDied","Data":"66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f"} Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.787414 4704 scope.go:117] "RemoveContainer" containerID="d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.788728 4704 scope.go:117] "RemoveContainer" containerID="66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f" Nov 25 15:36:09 crc kubenswrapper[4704]: E1125 15:36:09.789013 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.817621 4704 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20
d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783d
dfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.824315 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.824364 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.824374 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.824390 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.824401 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:09Z","lastTransitionTime":"2025-11-25T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.834407 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.846612 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.864294 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d044b149e11472a38ca4b46510d8f96fbe5e2335925211ebd02319e97fa76d70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"er, err: could not add Event Handler for anpInformer during admin network policy 
controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:35:52Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:35:52.084017 6162 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 15:35:52.084511 6162 obj_retry.go:303] Retry \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"message\\\":\\\"onfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, 
nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1125 15:36:09.236915 6377 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.235648 6377 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97 after 0 failed attempt(s)\\\\nI1125 15:36:09.236928 6377 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97\\\\nI1125 15:36:09.236928 6377 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236939 6377 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236944 6377 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.236959 6377 services_controller.go:451] Built service 
openshift-authentication/oauth-open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:36:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366e
b6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.877904 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.891569 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762ff0fd-9b25-4158-bd68-957bbfa4298c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8b8c4cc5291456d1da07123c3cb28928c03f31896d7ecc39be043bf8b8d9ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a86e85d42ee299ff07f19e664e89971cb63b7cc1edd398c1bedf638314ee482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0b95608507532a76441cdae944fe025fdaf833fd16e62ca0851043de8bb308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.906926 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.919647 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.930811 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.930863 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.930875 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 
15:36:09.930896 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.930908 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:09Z","lastTransitionTime":"2025-11-25T15:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.930959 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.946072 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc
4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\
\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.964714 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ec
f16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.976009 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:09 crc kubenswrapper[4704]: I1125 15:36:09.990270 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:09Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.003248 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.015345 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.028329 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.032953 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.033000 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.033012 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.033030 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.033043 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:10Z","lastTransitionTime":"2025-11-25T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.039427 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://313f477ea11a405cea82c897651b0950eb648d63a44b5292a774c5d943f21483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.050699 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc 
kubenswrapper[4704]: I1125 15:36:10.135050 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.135084 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.135093 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.135108 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.135119 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:10Z","lastTransitionTime":"2025-11-25T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.238422 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.238461 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.238473 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.238492 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.238505 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:10Z","lastTransitionTime":"2025-11-25T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.340719 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.340759 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.340776 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.340814 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.340826 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:10Z","lastTransitionTime":"2025-11-25T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.415715 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.415733 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:10 crc kubenswrapper[4704]: E1125 15:36:10.415891 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:10 crc kubenswrapper[4704]: E1125 15:36:10.416171 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.443371 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.443410 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.443419 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.443433 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.443445 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:10Z","lastTransitionTime":"2025-11-25T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.545486 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.545541 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.545549 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.545570 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.545589 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:10Z","lastTransitionTime":"2025-11-25T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.631710 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.631774 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.631804 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.631824 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.631838 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:10Z","lastTransitionTime":"2025-11-25T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:10 crc kubenswrapper[4704]: E1125 15:36:10.645731 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.655011 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.655057 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.655071 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.655091 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.655103 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:10Z","lastTransitionTime":"2025-11-25T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:10 crc kubenswrapper[4704]: E1125 15:36:10.668187 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.673069 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.673129 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.673140 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.673160 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.673174 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:10Z","lastTransitionTime":"2025-11-25T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:10 crc kubenswrapper[4704]: E1125 15:36:10.686669 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.691443 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.691490 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.691501 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.691519 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.691531 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:10Z","lastTransitionTime":"2025-11-25T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:10 crc kubenswrapper[4704]: E1125 15:36:10.704873 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.710928 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.711001 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.711019 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.711040 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.711054 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:10Z","lastTransitionTime":"2025-11-25T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:10 crc kubenswrapper[4704]: E1125 15:36:10.725149 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: E1125 15:36:10.725365 4704 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.727331 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.727387 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.727401 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.727421 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.727432 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:10Z","lastTransitionTime":"2025-11-25T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.792472 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovnkube-controller/2.log" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.795565 4704 scope.go:117] "RemoveContainer" containerID="66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f" Nov 25 15:36:10 crc kubenswrapper[4704]: E1125 15:36:10.795739 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.807390 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.819100 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.831510 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.831561 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.831571 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.831588 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.831601 4704 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:10Z","lastTransitionTime":"2025-11-25T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.834034 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.847912 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.859383 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.871655 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.881920 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://313f477ea11a405cea82c897651b0950eb648d63a44b5292a774c5d943f21483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.893391 4704 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc 
kubenswrapper[4704]: I1125 15:36:10.912003 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.926856 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.934569 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.934616 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.934625 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.934642 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.934651 4704 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:10Z","lastTransitionTime":"2025-11-25T15:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.940056 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.958111 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"message\\\":\\\"onfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, 
protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1125 15:36:09.236915 6377 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.235648 6377 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97 after 0 failed attempt(s)\\\\nI1125 15:36:09.236928 6377 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97\\\\nI1125 15:36:09.236928 6377 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236939 6377 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236944 6377 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.236959 6377 services_controller.go:451] Built service openshift-authentication/oauth-open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:36:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b
481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.970473 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.982888 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762ff0fd-9b25-4158-bd68-957bbfa4298c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8b8c4cc5291456d1da07123c3cb28928c03f31896d7ecc39be043bf8b8d9ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a86e85d42ee299ff07f19e664e89971cb63b7cc1edd398c1bedf638314ee482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0b95608507532a76441cdae944fe025fdaf833fd16e62ca0851043de8bb308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:10 crc kubenswrapper[4704]: I1125 15:36:10.996782 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:10Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.008470 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.018627 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.035735 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e
1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:11Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.037779 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.037860 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.037877 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.037900 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.037921 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:11Z","lastTransitionTime":"2025-11-25T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.141431 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.141518 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.141532 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.141549 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.141562 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:11Z","lastTransitionTime":"2025-11-25T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.244223 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.244284 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.244297 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.244318 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.244331 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:11Z","lastTransitionTime":"2025-11-25T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.347238 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.347284 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.347293 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.347312 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.347323 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:11Z","lastTransitionTime":"2025-11-25T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.415922 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.415928 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:11 crc kubenswrapper[4704]: E1125 15:36:11.416091 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:11 crc kubenswrapper[4704]: E1125 15:36:11.416344 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.450178 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.450224 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.450237 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.450255 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.450264 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:11Z","lastTransitionTime":"2025-11-25T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.552969 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.553008 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.553020 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.553037 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.553048 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:11Z","lastTransitionTime":"2025-11-25T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.656018 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.656061 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.656072 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.656093 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.656107 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:11Z","lastTransitionTime":"2025-11-25T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.758338 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.758415 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.758426 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.758441 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.758452 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:11Z","lastTransitionTime":"2025-11-25T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.861412 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.861462 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.861473 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.861491 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.861502 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:11Z","lastTransitionTime":"2025-11-25T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.963854 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.963919 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.963931 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.963946 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:11 crc kubenswrapper[4704]: I1125 15:36:11.963955 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:11Z","lastTransitionTime":"2025-11-25T15:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.066861 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.066947 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.066957 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.066990 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.067001 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:12Z","lastTransitionTime":"2025-11-25T15:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.169247 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.169291 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.169300 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.169315 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.169327 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:12Z","lastTransitionTime":"2025-11-25T15:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.272030 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.272069 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.272077 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.272095 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.272106 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:12Z","lastTransitionTime":"2025-11-25T15:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.374727 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.374780 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.374806 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.374827 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.374839 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:12Z","lastTransitionTime":"2025-11-25T15:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.415457 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.415727 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:12 crc kubenswrapper[4704]: E1125 15:36:12.415929 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:12 crc kubenswrapper[4704]: E1125 15:36:12.416084 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.476810 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.476861 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.476876 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.476892 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.476902 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:12Z","lastTransitionTime":"2025-11-25T15:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.579976 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.580035 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.580047 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.580067 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.580080 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:12Z","lastTransitionTime":"2025-11-25T15:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.682697 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.682783 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.682819 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.682839 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.682851 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:12Z","lastTransitionTime":"2025-11-25T15:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.785159 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.785228 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.785252 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.785277 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.785298 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:12Z","lastTransitionTime":"2025-11-25T15:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.887538 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.887597 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.887622 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.887687 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.887711 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:12Z","lastTransitionTime":"2025-11-25T15:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.990505 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.990562 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.990584 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.990607 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:12 crc kubenswrapper[4704]: I1125 15:36:12.990626 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:12Z","lastTransitionTime":"2025-11-25T15:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.098619 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.098670 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.098683 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.098702 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.098715 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:13Z","lastTransitionTime":"2025-11-25T15:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.201907 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.201968 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.201980 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.202005 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.202020 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:13Z","lastTransitionTime":"2025-11-25T15:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.304644 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.304706 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.304724 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.304745 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.304758 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:13Z","lastTransitionTime":"2025-11-25T15:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.408220 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.408264 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.408273 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.408291 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.408302 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:13Z","lastTransitionTime":"2025-11-25T15:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.415867 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.415873 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:13 crc kubenswrapper[4704]: E1125 15:36:13.416053 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:13 crc kubenswrapper[4704]: E1125 15:36:13.416080 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.511722 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.511774 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.511817 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.511839 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.511850 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:13Z","lastTransitionTime":"2025-11-25T15:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.614443 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.614754 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.614768 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.614828 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.614843 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:13Z","lastTransitionTime":"2025-11-25T15:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.718924 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.718980 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.718995 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.719015 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.719027 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:13Z","lastTransitionTime":"2025-11-25T15:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.822188 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.822248 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.822262 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.822280 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.822292 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:13Z","lastTransitionTime":"2025-11-25T15:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.925233 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.925281 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.925291 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.925308 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:13 crc kubenswrapper[4704]: I1125 15:36:13.925322 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:13Z","lastTransitionTime":"2025-11-25T15:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.029529 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.029589 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.029601 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.029623 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.029641 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:14Z","lastTransitionTime":"2025-11-25T15:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.132287 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.132322 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.132331 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.132347 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.132356 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:14Z","lastTransitionTime":"2025-11-25T15:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.237372 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.237432 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.237444 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.237464 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.237475 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:14Z","lastTransitionTime":"2025-11-25T15:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.339523 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.339559 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.339750 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.339770 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.339780 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:14Z","lastTransitionTime":"2025-11-25T15:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.415611 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.415727 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:14 crc kubenswrapper[4704]: E1125 15:36:14.415901 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:14 crc kubenswrapper[4704]: E1125 15:36:14.416082 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.433248 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.443067 4704 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.443114 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.443124 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.443142 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.443152 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:14Z","lastTransitionTime":"2025-11-25T15:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.447748 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.466728 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"message\\\":\\\"onfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, 
protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1125 15:36:09.236915 6377 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.235648 6377 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97 after 0 failed attempt(s)\\\\nI1125 15:36:09.236928 6377 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97\\\\nI1125 15:36:09.236928 6377 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236939 6377 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236944 6377 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.236959 6377 services_controller.go:451] Built service openshift-authentication/oauth-open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:36:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b
481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.490557 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.505229 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"762ff0fd-9b25-4158-bd68-957bbfa4298c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8b8c4cc5291456d1da07123c3cb28928c03f31896d7ecc39be043bf8b8d9ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a86e85d42ee299ff07f19e664e89971cb63b7cc1edd398c1bedf638314ee482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0b95608507532a76441cdae944fe025fdaf833fd16e62ca0851043de8bb308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.523450 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.537637 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.545062 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.545174 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.545188 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 
15:36:14.545226 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.545241 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:14Z","lastTransitionTime":"2025-11-25T15:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.552106 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.569399 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc
4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\
\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.584203 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.594861 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.606000 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.618169 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.630820 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.644010 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.648043 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.648085 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.648095 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.648113 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.648124 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:14Z","lastTransitionTime":"2025-11-25T15:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.654069 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://313f477ea11a405cea82c897651b0950eb648d63a44b5292a774c5d943f21483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.664154 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc 
kubenswrapper[4704]: I1125 15:36:14.678124 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:14Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.750428 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.750467 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.750477 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.750498 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.750516 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:14Z","lastTransitionTime":"2025-11-25T15:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.853511 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.853597 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.853609 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.853649 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.853662 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:14Z","lastTransitionTime":"2025-11-25T15:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.956847 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.956901 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.956917 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.956936 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:14 crc kubenswrapper[4704]: I1125 15:36:14.956951 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:14Z","lastTransitionTime":"2025-11-25T15:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.059958 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.060028 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.060042 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.060066 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.060081 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:15Z","lastTransitionTime":"2025-11-25T15:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.162834 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.162880 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.162892 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.162908 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.162921 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:15Z","lastTransitionTime":"2025-11-25T15:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.267437 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.267504 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.267523 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.267550 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.267569 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:15Z","lastTransitionTime":"2025-11-25T15:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.370832 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.370887 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.370899 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.370920 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.370933 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:15Z","lastTransitionTime":"2025-11-25T15:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.415833 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.415859 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:15 crc kubenswrapper[4704]: E1125 15:36:15.416080 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:15 crc kubenswrapper[4704]: E1125 15:36:15.416252 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.476903 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.476982 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.477000 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.477028 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.477050 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:15Z","lastTransitionTime":"2025-11-25T15:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.580455 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.580507 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.580517 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.580534 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.580544 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:15Z","lastTransitionTime":"2025-11-25T15:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.684023 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.684480 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.684584 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.684688 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.684782 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:15Z","lastTransitionTime":"2025-11-25T15:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.788687 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.789064 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.789130 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.789210 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.789274 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:15Z","lastTransitionTime":"2025-11-25T15:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.892408 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.892456 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.892466 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.892484 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.892496 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:15Z","lastTransitionTime":"2025-11-25T15:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.995597 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.995643 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.995654 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.995672 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:15 crc kubenswrapper[4704]: I1125 15:36:15.995684 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:15Z","lastTransitionTime":"2025-11-25T15:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.099018 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.099077 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.099093 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.099121 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.099139 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:16Z","lastTransitionTime":"2025-11-25T15:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.203222 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.203282 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.203295 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.203313 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.203329 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:16Z","lastTransitionTime":"2025-11-25T15:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.305804 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.305853 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.305869 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.305890 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.305905 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:16Z","lastTransitionTime":"2025-11-25T15:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.409018 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.409058 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.409070 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.409114 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.409147 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:16Z","lastTransitionTime":"2025-11-25T15:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.415389 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:16 crc kubenswrapper[4704]: E1125 15:36:16.415525 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.415657 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:16 crc kubenswrapper[4704]: E1125 15:36:16.415903 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.511703 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.512626 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.512701 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.512817 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.512923 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:16Z","lastTransitionTime":"2025-11-25T15:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.616153 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.616227 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.616243 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.616263 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.616301 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:16Z","lastTransitionTime":"2025-11-25T15:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.719344 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.719421 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.719432 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.719452 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:16 crc kubenswrapper[4704]: I1125 15:36:16.719464 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:16Z","lastTransitionTime":"2025-11-25T15:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.217686 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.217746 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.217763 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.217782 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.217813 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:17Z","lastTransitionTime":"2025-11-25T15:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.320848 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.321213 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.321314 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.321426 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.321518 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:17Z","lastTransitionTime":"2025-11-25T15:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.416370 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:17 crc kubenswrapper[4704]: E1125 15:36:17.416888 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.417005 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:17 crc kubenswrapper[4704]: E1125 15:36:17.417111 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.424503 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.424616 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.424682 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.424753 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.424879 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:17Z","lastTransitionTime":"2025-11-25T15:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.527769 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.527833 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.527842 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.527861 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.527870 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:17Z","lastTransitionTime":"2025-11-25T15:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.630591 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.630919 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.630994 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.631062 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.631123 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:17Z","lastTransitionTime":"2025-11-25T15:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.733276 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.733686 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.733771 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.733896 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.733982 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:17Z","lastTransitionTime":"2025-11-25T15:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.836162 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.836204 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.836213 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.836228 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.836243 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:17Z","lastTransitionTime":"2025-11-25T15:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.939028 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.939067 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.939079 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.939098 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:17 crc kubenswrapper[4704]: I1125 15:36:17.939113 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:17Z","lastTransitionTime":"2025-11-25T15:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.042096 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.042141 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.042154 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.042172 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.042186 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:18Z","lastTransitionTime":"2025-11-25T15:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.145084 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.145143 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.145157 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.145177 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.145232 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:18Z","lastTransitionTime":"2025-11-25T15:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.247666 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.247704 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.247712 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.247728 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.247739 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:18Z","lastTransitionTime":"2025-11-25T15:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.350810 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.350847 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.350857 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.350875 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.350887 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:18Z","lastTransitionTime":"2025-11-25T15:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.415674 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.415708 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:18 crc kubenswrapper[4704]: E1125 15:36:18.415879 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:18 crc kubenswrapper[4704]: E1125 15:36:18.416043 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.459186 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.459237 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.459248 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.459265 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.459277 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:18Z","lastTransitionTime":"2025-11-25T15:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.562644 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.562697 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.562710 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.562728 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.562740 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:18Z","lastTransitionTime":"2025-11-25T15:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.665487 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.665534 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.665547 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.665564 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.665577 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:18Z","lastTransitionTime":"2025-11-25T15:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.768153 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.768194 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.768206 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.768242 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.768254 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:18Z","lastTransitionTime":"2025-11-25T15:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.871030 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.871091 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.871104 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.871122 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.871133 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:18Z","lastTransitionTime":"2025-11-25T15:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.973228 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.973262 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.973271 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.973287 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:18 crc kubenswrapper[4704]: I1125 15:36:18.973297 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:18Z","lastTransitionTime":"2025-11-25T15:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.075895 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.075947 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.075966 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.075985 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.075998 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:19Z","lastTransitionTime":"2025-11-25T15:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.178850 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.178904 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.178915 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.178934 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.178945 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:19Z","lastTransitionTime":"2025-11-25T15:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.281273 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.281325 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.281337 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.281354 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.281365 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:19Z","lastTransitionTime":"2025-11-25T15:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.383912 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.383950 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.383958 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.383974 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.383984 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:19Z","lastTransitionTime":"2025-11-25T15:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.415363 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.415378 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:19 crc kubenswrapper[4704]: E1125 15:36:19.415534 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:19 crc kubenswrapper[4704]: E1125 15:36:19.415570 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.486659 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.487061 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.487159 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.487250 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.487349 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:19Z","lastTransitionTime":"2025-11-25T15:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.589926 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.589988 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.590002 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.590021 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.590033 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:19Z","lastTransitionTime":"2025-11-25T15:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.692443 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.692488 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.692497 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.692513 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.692523 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:19Z","lastTransitionTime":"2025-11-25T15:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.795936 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.795972 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.795982 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.795996 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.796007 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:19Z","lastTransitionTime":"2025-11-25T15:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.899127 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.899167 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.899180 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.899199 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:19 crc kubenswrapper[4704]: I1125 15:36:19.899212 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:19Z","lastTransitionTime":"2025-11-25T15:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.002382 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.002436 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.002450 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.002468 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.002485 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:20Z","lastTransitionTime":"2025-11-25T15:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.105613 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.105674 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.105685 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.105724 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.105740 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:20Z","lastTransitionTime":"2025-11-25T15:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.208762 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.208845 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.208858 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.208876 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.208908 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:20Z","lastTransitionTime":"2025-11-25T15:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.312057 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.312128 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.312140 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.312156 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.312169 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:20Z","lastTransitionTime":"2025-11-25T15:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.414908 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.414984 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.414995 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.415009 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.415019 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:20Z","lastTransitionTime":"2025-11-25T15:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.416020 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:20 crc kubenswrapper[4704]: E1125 15:36:20.416173 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.416413 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:20 crc kubenswrapper[4704]: E1125 15:36:20.416467 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.517749 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.517815 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.517827 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.517849 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.517862 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:20Z","lastTransitionTime":"2025-11-25T15:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.621033 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.621188 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.621201 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.621222 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.621241 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:20Z","lastTransitionTime":"2025-11-25T15:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.723655 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.723699 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.723709 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.723726 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.723736 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:20Z","lastTransitionTime":"2025-11-25T15:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.825452 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.825498 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.825508 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.825526 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.825540 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:20Z","lastTransitionTime":"2025-11-25T15:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.928338 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.928390 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.928402 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.928420 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:20 crc kubenswrapper[4704]: I1125 15:36:20.928431 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:20Z","lastTransitionTime":"2025-11-25T15:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.003188 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.003228 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.003238 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.003252 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.003263 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:21Z","lastTransitionTime":"2025-11-25T15:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:21 crc kubenswrapper[4704]: E1125 15:36:21.016483 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.020631 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.020663 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.020676 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.020695 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.020707 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:21Z","lastTransitionTime":"2025-11-25T15:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:21 crc kubenswrapper[4704]: E1125 15:36:21.033918 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.037651 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.037846 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.037927 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.037996 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.038066 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:21Z","lastTransitionTime":"2025-11-25T15:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:21 crc kubenswrapper[4704]: E1125 15:36:21.049455 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.053488 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.053517 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.053526 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.053543 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.053557 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:21Z","lastTransitionTime":"2025-11-25T15:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:21 crc kubenswrapper[4704]: E1125 15:36:21.065162 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.069157 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.069191 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.069200 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.069216 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.069226 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:21Z","lastTransitionTime":"2025-11-25T15:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:21 crc kubenswrapper[4704]: E1125 15:36:21.080935 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:21Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:21 crc kubenswrapper[4704]: E1125 15:36:21.081586 4704 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.083638 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.083664 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.083672 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.083686 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.083695 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:21Z","lastTransitionTime":"2025-11-25T15:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.186405 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.186452 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.186462 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.186478 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.186490 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:21Z","lastTransitionTime":"2025-11-25T15:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.288450 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.288503 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.288514 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.288530 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.288540 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:21Z","lastTransitionTime":"2025-11-25T15:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.392777 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.392860 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.392872 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.392888 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.392899 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:21Z","lastTransitionTime":"2025-11-25T15:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.416260 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.416281 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:21 crc kubenswrapper[4704]: E1125 15:36:21.416393 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:21 crc kubenswrapper[4704]: E1125 15:36:21.416575 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.495340 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.495682 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.495802 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.495898 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.495983 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:21Z","lastTransitionTime":"2025-11-25T15:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.598349 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.598388 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.598400 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.598417 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.598429 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:21Z","lastTransitionTime":"2025-11-25T15:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.701220 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.701744 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.701757 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.701777 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.701807 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:21Z","lastTransitionTime":"2025-11-25T15:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.804432 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.804470 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.804478 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.804493 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.804504 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:21Z","lastTransitionTime":"2025-11-25T15:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.906681 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.906746 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.906757 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.906779 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:21 crc kubenswrapper[4704]: I1125 15:36:21.906821 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:21Z","lastTransitionTime":"2025-11-25T15:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.009715 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.009810 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.009829 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.009852 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.009866 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:22Z","lastTransitionTime":"2025-11-25T15:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.112565 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.112633 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.112643 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.112661 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.112672 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:22Z","lastTransitionTime":"2025-11-25T15:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.215062 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.215121 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.215133 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.215150 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.215161 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:22Z","lastTransitionTime":"2025-11-25T15:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.317177 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.317226 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.317237 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.317254 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.317266 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:22Z","lastTransitionTime":"2025-11-25T15:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.416416 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.416499 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:22 crc kubenswrapper[4704]: E1125 15:36:22.416564 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:22 crc kubenswrapper[4704]: E1125 15:36:22.416668 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.420683 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.420731 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.420741 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.420758 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.420771 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:22Z","lastTransitionTime":"2025-11-25T15:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.427185 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.523023 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.523066 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.523078 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.523095 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.523109 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:22Z","lastTransitionTime":"2025-11-25T15:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.626359 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.626401 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.626412 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.626431 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.626443 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:22Z","lastTransitionTime":"2025-11-25T15:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.729465 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.729511 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.729522 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.729539 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.729551 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:22Z","lastTransitionTime":"2025-11-25T15:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.830997 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.831037 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.831048 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.831064 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.831075 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:22Z","lastTransitionTime":"2025-11-25T15:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.933491 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.933527 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.933537 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.933551 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:22 crc kubenswrapper[4704]: I1125 15:36:22.933560 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:22Z","lastTransitionTime":"2025-11-25T15:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.035967 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.036028 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.036039 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.036053 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.036066 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:23Z","lastTransitionTime":"2025-11-25T15:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.138122 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.138159 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.138167 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.138184 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.138193 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:23Z","lastTransitionTime":"2025-11-25T15:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.240748 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.240941 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.240958 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.240983 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.241002 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:23Z","lastTransitionTime":"2025-11-25T15:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.343155 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.343201 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.343209 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.343226 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.343238 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:23Z","lastTransitionTime":"2025-11-25T15:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.416071 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.416112 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:23 crc kubenswrapper[4704]: E1125 15:36:23.416231 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:23 crc kubenswrapper[4704]: E1125 15:36:23.416666 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.416963 4704 scope.go:117] "RemoveContainer" containerID="66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f" Nov 25 15:36:23 crc kubenswrapper[4704]: E1125 15:36:23.417127 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.446633 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.446697 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.446710 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.446731 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.446745 4704 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:23Z","lastTransitionTime":"2025-11-25T15:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.549778 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.549856 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.549867 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.549884 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.549895 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:23Z","lastTransitionTime":"2025-11-25T15:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.653389 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.653448 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.653459 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.653476 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.653488 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:23Z","lastTransitionTime":"2025-11-25T15:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.757036 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.757091 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.757101 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.757120 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.757131 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:23Z","lastTransitionTime":"2025-11-25T15:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.860059 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.860100 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.860116 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.860133 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.860149 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:23Z","lastTransitionTime":"2025-11-25T15:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.962648 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.962692 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.962700 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.962717 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:23 crc kubenswrapper[4704]: I1125 15:36:23.962727 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:23Z","lastTransitionTime":"2025-11-25T15:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.065507 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.065570 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.065583 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.065603 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.065616 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:24Z","lastTransitionTime":"2025-11-25T15:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.152193 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs\") pod \"network-metrics-daemon-z6lnx\" (UID: \"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\") " pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:24 crc kubenswrapper[4704]: E1125 15:36:24.152402 4704 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:36:24 crc kubenswrapper[4704]: E1125 15:36:24.152475 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs podName:b9cf8fad-2f72-4a94-958b-dd58fc76f4df nodeName:}" failed. No retries permitted until 2025-11-25 15:36:56.152457545 +0000 UTC m=+102.420731326 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs") pod "network-metrics-daemon-z6lnx" (UID: "b9cf8fad-2f72-4a94-958b-dd58fc76f4df") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.169046 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.169104 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.169115 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.169143 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.169156 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:24Z","lastTransitionTime":"2025-11-25T15:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.272281 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.272344 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.272357 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.272379 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.272392 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:24Z","lastTransitionTime":"2025-11-25T15:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.375076 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.375135 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.375146 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.375162 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.375173 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:24Z","lastTransitionTime":"2025-11-25T15:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.415920 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:24 crc kubenswrapper[4704]: E1125 15:36:24.416102 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.416763 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:24 crc kubenswrapper[4704]: E1125 15:36:24.417029 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.436031 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.447468 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762ff0fd-9b25-4158-bd68-957bbfa4298c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8b8c4cc5291456d1da07123c3cb28928c03f31896d7ecc39be043bf8b8d9ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a86e85d42ee299ff07f19e664e89971cb63b7cc1edd398c1bedf638314ee482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0b95608507532a76441cdae944fe025fdaf833fd16e62ca0851043de8bb308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.457717 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac8c9734-f4a0-4021-a36b-495183e0c4ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca92917da8a2f82963a21de252aca3b6ca15646ff0a30a07dfc3a0c25682691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcaac718777a480f97457edf8afd0a55e9199d31cf857614486e92b2897541e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dcaac718777a480f97457edf8afd0a55e9199d31cf857614486e92b2897541e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.474161 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.478412 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.478451 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.478462 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.478478 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.478500 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:24Z","lastTransitionTime":"2025-11-25T15:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.487139 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.498606 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.518173 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e
1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.530248 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.540366 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.553525 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.566985 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.581473 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.581514 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.581527 4704 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.581546 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.581560 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:24Z","lastTransitionTime":"2025-11-25T15:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.582130 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.594602 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.603626 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://313f477ea
11a405cea82c897651b0950eb648d63a44b5292a774c5d943f21483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.612879 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc 
kubenswrapper[4704]: I1125 15:36:24.630052 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.642358 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.651747 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.666846 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"message\\\":\\\"onfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1125 
15:36:09.236915 6377 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.235648 6377 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97 after 0 failed attempt(s)\\\\nI1125 15:36:09.236928 6377 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97\\\\nI1125 15:36:09.236928 6377 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236939 6377 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236944 6377 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.236959 6377 services_controller.go:451] Built service openshift-authentication/oauth-open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:36:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b
481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:24Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.683585 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.683635 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.683646 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.683664 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.683674 4704 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:24Z","lastTransitionTime":"2025-11-25T15:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.786086 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.786150 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.786162 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.786181 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.786193 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:24Z","lastTransitionTime":"2025-11-25T15:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.888798 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.888840 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.888850 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.888867 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.888877 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:24Z","lastTransitionTime":"2025-11-25T15:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.991099 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.991164 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.991178 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.991195 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:24 crc kubenswrapper[4704]: I1125 15:36:24.991207 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:24Z","lastTransitionTime":"2025-11-25T15:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.093748 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.093782 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.093814 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.093831 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.093841 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:25Z","lastTransitionTime":"2025-11-25T15:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.196401 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.196455 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.196466 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.196486 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.196499 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:25Z","lastTransitionTime":"2025-11-25T15:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.299499 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.299531 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.299539 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.299553 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.299562 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:25Z","lastTransitionTime":"2025-11-25T15:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.402082 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.402116 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.402124 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.402139 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.402148 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:25Z","lastTransitionTime":"2025-11-25T15:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.416122 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.416173 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:25 crc kubenswrapper[4704]: E1125 15:36:25.416281 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:25 crc kubenswrapper[4704]: E1125 15:36:25.416502 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.508162 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.508209 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.508220 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.508244 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.508254 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:25Z","lastTransitionTime":"2025-11-25T15:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.611837 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.611882 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.611892 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.611908 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.611920 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:25Z","lastTransitionTime":"2025-11-25T15:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.714765 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.715257 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.715276 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.715774 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.715828 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:25Z","lastTransitionTime":"2025-11-25T15:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.818975 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.819011 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.819021 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.819047 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.819058 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:25Z","lastTransitionTime":"2025-11-25T15:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.841740 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h92xm_d2820ade-e9bd-4146-b275-0c3b7d0cb5aa/kube-multus/0.log" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.841814 4704 generic.go:334] "Generic (PLEG): container finished" podID="d2820ade-e9bd-4146-b275-0c3b7d0cb5aa" containerID="d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59" exitCode=1 Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.841849 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h92xm" event={"ID":"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa","Type":"ContainerDied","Data":"d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59"} Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.842272 4704 scope.go:117] "RemoveContainer" containerID="d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.854680 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:25Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.870648 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:25Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.884041 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:25Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.902167 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:25Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.917158 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:25Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.922415 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.922450 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.922468 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.922486 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.922498 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:25Z","lastTransitionTime":"2025-11-25T15:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.935065 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:25Z\\\",\\\"message\\\":\\\"2025-11-25T15:35:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a1966ac7-4d16-411b-a7fa-fca7a20aa9b2\\\\n2025-11-25T15:35:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a1966ac7-4d16-411b-a7fa-fca7a20aa9b2 to /host/opt/cni/bin/\\\\n2025-11-25T15:35:40Z [verbose] multus-daemon started\\\\n2025-11-25T15:35:40Z [verbose] Readiness Indicator file check\\\\n2025-11-25T15:36:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:25Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.952152 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://313f477ea11a405cea82c897651b0950eb648d63a44b5292a774c5d943f21483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:25Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:25 crc kubenswrapper[4704]: I1125 15:36:25.971959 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:25Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:25 crc 
kubenswrapper[4704]: I1125 15:36:25.993168 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:25Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.009116 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.025871 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.026215 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.026280 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.026295 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.026317 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.026331 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:26Z","lastTransitionTime":"2025-11-25T15:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.048454 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"message\\\":\\\"onfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, 
protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1125 15:36:09.236915 6377 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.235648 6377 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97 after 0 failed attempt(s)\\\\nI1125 15:36:09.236928 6377 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97\\\\nI1125 15:36:09.236928 6377 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236939 6377 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236944 6377 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.236959 6377 services_controller.go:451] Built service openshift-authentication/oauth-open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:36:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b
481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.062690 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e
1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.074330 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.085934 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762ff0fd-9b25-4158-bd68-957bbfa4298c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8b8c4cc5291456d1da07123c3cb28928c03f31896d7ecc39be043bf8b8d9ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a86e85d42ee299ff07f19e664e89971cb63b7cc1edd398c1bedf638314ee482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0b95608507532a76441cdae944fe025fdaf833fd16e62ca0851043de8bb308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.096704 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac8c9734-f4a0-4021-a36b-495183e0c4ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca92917da8a2f82963a21de252aca3b6ca15646ff0a30a07dfc3a0c25682691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcaac718777a480f97457edf8afd0a55e9199d31cf857614486e92b2897541e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dcaac718777a480f97457edf8afd0a55e9199d31cf857614486e92b2897541e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.109875 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.121781 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.129063 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.129102 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.129112 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 
15:36:26.129132 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.129143 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:26Z","lastTransitionTime":"2025-11-25T15:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.131480 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.231416 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.231463 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.231472 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.231489 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.231500 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:26Z","lastTransitionTime":"2025-11-25T15:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.334126 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.334169 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.334178 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.334193 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.334203 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:26Z","lastTransitionTime":"2025-11-25T15:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.416269 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.416284 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:26 crc kubenswrapper[4704]: E1125 15:36:26.416401 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:26 crc kubenswrapper[4704]: E1125 15:36:26.416496 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.436772 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.436847 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.436859 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.436878 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.436890 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:26Z","lastTransitionTime":"2025-11-25T15:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.539619 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.539671 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.539684 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.539704 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.539719 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:26Z","lastTransitionTime":"2025-11-25T15:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.642189 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.642239 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.642249 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.642267 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.642277 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:26Z","lastTransitionTime":"2025-11-25T15:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.744427 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.744473 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.744482 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.744502 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.744513 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:26Z","lastTransitionTime":"2025-11-25T15:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.850426 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.850537 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.850550 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.850568 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.850579 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:26Z","lastTransitionTime":"2025-11-25T15:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.851572 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h92xm_d2820ade-e9bd-4146-b275-0c3b7d0cb5aa/kube-multus/0.log" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.851630 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h92xm" event={"ID":"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa","Type":"ContainerStarted","Data":"89cb23cc625602134c0e14c76fa545386707fdb2815ffbc0563737edc6552ff0"} Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.865963 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.878197 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762ff0fd-9b25-4158-bd68-957bbfa4298c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8b8c4cc5291456d1da07123c3cb28928c03f31896d7ecc39be043bf8b8d9ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a86e85d42ee299ff07f19e664e89971cb63b7cc1edd398c1bedf638314ee482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0b95608507532a76441cdae944fe025fdaf833fd16e62ca0851043de8bb308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.889434 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac8c9734-f4a0-4021-a36b-495183e0c4ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca92917da8a2f82963a21de252aca3b6ca15646ff0a30a07dfc3a0c25682691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcaac718777a480f97457edf8afd0a55e9199d31cf857614486e92b2897541e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dcaac718777a480f97457edf8afd0a55e9199d31cf857614486e92b2897541e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.904021 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.918354 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.931825 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.945463 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e
1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.953319 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.953356 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.953365 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.953384 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.953395 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:26Z","lastTransitionTime":"2025-11-25T15:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.957178 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.968394 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.982405 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:26 crc kubenswrapper[4704]: I1125 15:36:26.995494 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:26Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.006273 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.020874 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89cb23cc625602134c0e14c76fa545386707fdb2815ffbc0563737edc6552ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:25Z\\\",\\\"message\\\":\\\"2025-11-25T15:35:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_a1966ac7-4d16-411b-a7fa-fca7a20aa9b2\\\\n2025-11-25T15:35:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a1966ac7-4d16-411b-a7fa-fca7a20aa9b2 to /host/opt/cni/bin/\\\\n2025-11-25T15:35:40Z [verbose] multus-daemon started\\\\n2025-11-25T15:35:40Z [verbose] Readiness Indicator file check\\\\n2025-11-25T15:36:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:36:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.032880 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://313f477ea11a405cea82c897651b0950eb648
d63a44b5292a774c5d943f21483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.043602 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:27 crc 
kubenswrapper[4704]: I1125 15:36:27.055912 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.055956 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.055968 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.055986 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.056003 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:27Z","lastTransitionTime":"2025-11-25T15:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.066367 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.079975 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.094457 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.114064 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"message\\\":\\\"onfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1125 
15:36:09.236915 6377 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.235648 6377 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97 after 0 failed attempt(s)\\\\nI1125 15:36:09.236928 6377 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97\\\\nI1125 15:36:09.236928 6377 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236939 6377 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236944 6377 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.236959 6377 services_controller.go:451] Built service openshift-authentication/oauth-open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:36:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b
481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:27Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.159022 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.159079 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.159094 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.159112 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.159128 4704 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:27Z","lastTransitionTime":"2025-11-25T15:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.261673 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.261717 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.261729 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.261748 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.261762 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:27Z","lastTransitionTime":"2025-11-25T15:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.364569 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.364622 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.364633 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.364650 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.364664 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:27Z","lastTransitionTime":"2025-11-25T15:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.416015 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.416128 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:27 crc kubenswrapper[4704]: E1125 15:36:27.416235 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:27 crc kubenswrapper[4704]: E1125 15:36:27.416287 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.467133 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.467190 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.467202 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.467223 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.467238 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:27Z","lastTransitionTime":"2025-11-25T15:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.570236 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.570296 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.570311 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.570334 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.570347 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:27Z","lastTransitionTime":"2025-11-25T15:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.673744 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.673826 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.673844 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.673866 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.673880 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:27Z","lastTransitionTime":"2025-11-25T15:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.776969 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.777034 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.777047 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.777068 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.777085 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:27Z","lastTransitionTime":"2025-11-25T15:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.878997 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.879044 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.879056 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.879073 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.879104 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:27Z","lastTransitionTime":"2025-11-25T15:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.981535 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.981595 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.981611 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.981630 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:27 crc kubenswrapper[4704]: I1125 15:36:27.981645 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:27Z","lastTransitionTime":"2025-11-25T15:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.084120 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.084166 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.084176 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.084190 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.084200 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:28Z","lastTransitionTime":"2025-11-25T15:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.187496 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.187557 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.187570 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.187591 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.187605 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:28Z","lastTransitionTime":"2025-11-25T15:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.290441 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.290494 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.290505 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.290521 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.290534 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:28Z","lastTransitionTime":"2025-11-25T15:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.393704 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.393775 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.393812 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.393831 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.393843 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:28Z","lastTransitionTime":"2025-11-25T15:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.415419 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.415450 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:28 crc kubenswrapper[4704]: E1125 15:36:28.415576 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:28 crc kubenswrapper[4704]: E1125 15:36:28.415664 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.496745 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.496812 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.496826 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.496845 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.496858 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:28Z","lastTransitionTime":"2025-11-25T15:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.600398 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.600448 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.600457 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.600474 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.600508 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:28Z","lastTransitionTime":"2025-11-25T15:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.707159 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.707208 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.707219 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.707236 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.707246 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:28Z","lastTransitionTime":"2025-11-25T15:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.810150 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.810196 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.810207 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.810225 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.810237 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:28Z","lastTransitionTime":"2025-11-25T15:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.912970 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.913026 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.913037 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.913054 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:28 crc kubenswrapper[4704]: I1125 15:36:28.913065 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:28Z","lastTransitionTime":"2025-11-25T15:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.015569 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.015613 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.015624 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.015641 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.015651 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:29Z","lastTransitionTime":"2025-11-25T15:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.118521 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.118567 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.118576 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.118594 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.118606 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:29Z","lastTransitionTime":"2025-11-25T15:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.220606 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.220642 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.220656 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.220674 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.220684 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:29Z","lastTransitionTime":"2025-11-25T15:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.323328 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.323373 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.323384 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.323402 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.323414 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:29Z","lastTransitionTime":"2025-11-25T15:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.415360 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:29 crc kubenswrapper[4704]: E1125 15:36:29.415504 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.415944 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:29 crc kubenswrapper[4704]: E1125 15:36:29.416190 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.426366 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.426414 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.426424 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.426444 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.426457 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:29Z","lastTransitionTime":"2025-11-25T15:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.529478 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.529530 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.529541 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.529559 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.529573 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:29Z","lastTransitionTime":"2025-11-25T15:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.634182 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.634248 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.634261 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.634283 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.634296 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:29Z","lastTransitionTime":"2025-11-25T15:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.736594 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.736647 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.736661 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.736679 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.736691 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:29Z","lastTransitionTime":"2025-11-25T15:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.839594 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.839653 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.839664 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.839682 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.839707 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:29Z","lastTransitionTime":"2025-11-25T15:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.941869 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.941908 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.941917 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.941931 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:29 crc kubenswrapper[4704]: I1125 15:36:29.941944 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:29Z","lastTransitionTime":"2025-11-25T15:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.044978 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.045026 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.045035 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.045052 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.045062 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:30Z","lastTransitionTime":"2025-11-25T15:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.147906 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.147958 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.147970 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.147989 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.147999 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:30Z","lastTransitionTime":"2025-11-25T15:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.250661 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.250710 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.250721 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.250741 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.250755 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:30Z","lastTransitionTime":"2025-11-25T15:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.352904 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.352961 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.352989 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.353006 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.353015 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:30Z","lastTransitionTime":"2025-11-25T15:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.416018 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:30 crc kubenswrapper[4704]: E1125 15:36:30.416160 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.416279 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:30 crc kubenswrapper[4704]: E1125 15:36:30.416488 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.454926 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.455170 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.455295 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.455388 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.455576 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:30Z","lastTransitionTime":"2025-11-25T15:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.558191 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.558258 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.558279 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.558310 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.558325 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:30Z","lastTransitionTime":"2025-11-25T15:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.660435 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.660765 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.660845 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.660975 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.661040 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:30Z","lastTransitionTime":"2025-11-25T15:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.768842 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.768896 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.768907 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.768924 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.768938 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:30Z","lastTransitionTime":"2025-11-25T15:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.870463 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.870511 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.870526 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.870545 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.870557 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:30Z","lastTransitionTime":"2025-11-25T15:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.972835 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.972908 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.972918 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.972945 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:30 crc kubenswrapper[4704]: I1125 15:36:30.972955 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:30Z","lastTransitionTime":"2025-11-25T15:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.075166 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.075213 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.075226 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.075247 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.075260 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:31Z","lastTransitionTime":"2025-11-25T15:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.146691 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.147182 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.147263 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.147358 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.147515 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:31Z","lastTransitionTime":"2025-11-25T15:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:31 crc kubenswrapper[4704]: E1125 15:36:31.160819 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.164662 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.164690 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.164702 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.164717 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.164727 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:31Z","lastTransitionTime":"2025-11-25T15:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:31 crc kubenswrapper[4704]: E1125 15:36:31.185609 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.189541 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.189681 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.189828 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.189899 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.189957 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:31Z","lastTransitionTime":"2025-11-25T15:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:31 crc kubenswrapper[4704]: E1125 15:36:31.203672 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.206994 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.207116 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.207181 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.207263 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.207342 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:31Z","lastTransitionTime":"2025-11-25T15:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:31 crc kubenswrapper[4704]: E1125 15:36:31.238992 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.244166 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.244428 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.244522 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.244620 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.244706 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:31Z","lastTransitionTime":"2025-11-25T15:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:31 crc kubenswrapper[4704]: E1125 15:36:31.263657 4704 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc66017c-37c1-4b18-ad41-23da03d4564b\\\",\\\"systemUUID\\\":\\\"7a7c0a64-f7eb-4637-84e2-93500c0e5ef0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:31Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:31 crc kubenswrapper[4704]: E1125 15:36:31.264228 4704 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.266083 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.266144 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.266161 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.266180 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.266192 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:31Z","lastTransitionTime":"2025-11-25T15:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.368460 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.368495 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.368504 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.368519 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.368530 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:31Z","lastTransitionTime":"2025-11-25T15:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.415461 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.415554 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:31 crc kubenswrapper[4704]: E1125 15:36:31.415637 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:31 crc kubenswrapper[4704]: E1125 15:36:31.415839 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.470888 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.470982 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.470994 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.471015 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.471027 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:31Z","lastTransitionTime":"2025-11-25T15:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.574936 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.575292 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.575401 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.575489 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.575579 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:31Z","lastTransitionTime":"2025-11-25T15:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.678162 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.678535 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.678625 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.678702 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.678767 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:31Z","lastTransitionTime":"2025-11-25T15:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.781179 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.781216 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.781224 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.781241 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.781249 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:31Z","lastTransitionTime":"2025-11-25T15:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.883882 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.883959 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.883970 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.883988 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.884003 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:31Z","lastTransitionTime":"2025-11-25T15:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.986865 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.986910 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.986920 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.986937 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:31 crc kubenswrapper[4704]: I1125 15:36:31.986948 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:31Z","lastTransitionTime":"2025-11-25T15:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.090842 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.090890 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.090899 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.090916 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.090935 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:32Z","lastTransitionTime":"2025-11-25T15:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.193543 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.193579 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.193591 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.193610 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.193624 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:32Z","lastTransitionTime":"2025-11-25T15:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.295683 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.295726 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.295737 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.295753 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.295764 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:32Z","lastTransitionTime":"2025-11-25T15:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.398818 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.398881 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.398898 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.398917 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.398929 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:32Z","lastTransitionTime":"2025-11-25T15:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.416420 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.416497 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:32 crc kubenswrapper[4704]: E1125 15:36:32.416596 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:32 crc kubenswrapper[4704]: E1125 15:36:32.416701 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.500692 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.500729 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.500739 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.500754 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.500763 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:32Z","lastTransitionTime":"2025-11-25T15:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.602682 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.602749 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.602760 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.602778 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.602805 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:32Z","lastTransitionTime":"2025-11-25T15:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.705699 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.705756 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.705766 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.705819 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.705830 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:32Z","lastTransitionTime":"2025-11-25T15:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.808439 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.808485 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.808496 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.808514 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.808525 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:32Z","lastTransitionTime":"2025-11-25T15:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.911660 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.911709 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.911721 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.911738 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:32 crc kubenswrapper[4704]: I1125 15:36:32.911747 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:32Z","lastTransitionTime":"2025-11-25T15:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.014218 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.014261 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.014272 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.014290 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.014300 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:33Z","lastTransitionTime":"2025-11-25T15:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.116982 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.117022 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.117034 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.117051 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.117064 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:33Z","lastTransitionTime":"2025-11-25T15:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.219929 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.219976 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.219988 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.220005 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.220018 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:33Z","lastTransitionTime":"2025-11-25T15:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.322262 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.322317 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.322331 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.322350 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.322363 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:33Z","lastTransitionTime":"2025-11-25T15:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.415980 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.415991 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:33 crc kubenswrapper[4704]: E1125 15:36:33.416188 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:33 crc kubenswrapper[4704]: E1125 15:36:33.416258 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.424629 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.424664 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.424672 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.424686 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.424695 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:33Z","lastTransitionTime":"2025-11-25T15:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.528070 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.528121 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.528132 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.528150 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.528169 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:33Z","lastTransitionTime":"2025-11-25T15:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.631215 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.631251 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.631260 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.631275 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.631284 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:33Z","lastTransitionTime":"2025-11-25T15:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.734071 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.734106 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.734115 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.734131 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.734140 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:33Z","lastTransitionTime":"2025-11-25T15:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.836241 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.836284 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.836292 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.836308 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.836322 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:33Z","lastTransitionTime":"2025-11-25T15:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.938057 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.938097 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.938106 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.938123 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:33 crc kubenswrapper[4704]: I1125 15:36:33.938133 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:33Z","lastTransitionTime":"2025-11-25T15:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.040683 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.040716 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.040724 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.040740 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.040749 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:34Z","lastTransitionTime":"2025-11-25T15:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.143024 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.143057 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.143068 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.143086 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.143097 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:34Z","lastTransitionTime":"2025-11-25T15:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.245931 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.246380 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.246477 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.246617 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.246640 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:34Z","lastTransitionTime":"2025-11-25T15:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.349268 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.349314 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.349326 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.349343 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.349356 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:34Z","lastTransitionTime":"2025-11-25T15:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.416195 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.416251 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:34 crc kubenswrapper[4704]: E1125 15:36:34.416373 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:34 crc kubenswrapper[4704]: E1125 15:36:34.416552 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.437619 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"message\\\":\\\"onfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1125 
15:36:09.236915 6377 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.235648 6377 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97 after 0 failed attempt(s)\\\\nI1125 15:36:09.236928 6377 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97\\\\nI1125 15:36:09.236928 6377 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236939 6377 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236944 6377 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.236959 6377 services_controller.go:451] Built service openshift-authentication/oauth-open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:36:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b
481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.452903 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.453341 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.453436 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.453529 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.453604 4704 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:34Z","lastTransitionTime":"2025-11-25T15:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.461898 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.478121 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.493488 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.510268 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.525495 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.541297 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.557005 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.557043 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.557053 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:34 crc 
kubenswrapper[4704]: I1125 15:36:34.557070 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.557080 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:34Z","lastTransitionTime":"2025-11-25T15:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.559807 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49
0002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.573761 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.588522 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762ff0fd-9b25-4158-bd68-957bbfa4298c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8b8c4cc5291456d1da07123c3cb28928c03f31896d7ecc39be043bf8b8d9ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a86e85d42ee299ff07f19e664e89971cb63b7cc1edd398c1bedf638314ee482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0b95608507532a76441cdae944fe025fdaf833fd16e62ca0851043de8bb308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.601399 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac8c9734-f4a0-4021-a36b-495183e0c4ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca92917da8a2f82963a21de252aca3b6ca15646ff0a30a07dfc3a0c25682691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcaac718777a480f97457edf8afd0a55e9199d31cf857614486e92b2897541e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dcaac718777a480f97457edf8afd0a55e9199d31cf857614486e92b2897541e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.615427 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.630365 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.644503 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89cb23cc625602134c0e14c76fa545386707fdb2815ffbc0563737edc6552ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:25Z\\\",\\\"message\\\":\\\"2025-11-25T15:35:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a1966ac7-4d16-411b-a7fa-fca7a20aa9b2\\\\n2025-11-25T15:35:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a1966ac7-4d16-411b-a7fa-fca7a20aa9b2 to /host/opt/cni/bin/\\\\n2025-11-25T15:35:40Z [verbose] multus-daemon started\\\\n2025-11-25T15:35:40Z [verbose] 
Readiness Indicator file check\\\\n2025-11-25T15:36:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:36:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.658677 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://313f477ea11a405cea82c897651b0950eb648
d63a44b5292a774c5d943f21483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.659678 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.659712 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.659720 4704 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.659738 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.659749 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:34Z","lastTransitionTime":"2025-11-25T15:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.672658 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 
25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.689420 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.704728 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.719640 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:34Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.762106 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.762137 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.762147 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.762164 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.762175 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:34Z","lastTransitionTime":"2025-11-25T15:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.864080 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.864135 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.864148 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.864167 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.864179 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:34Z","lastTransitionTime":"2025-11-25T15:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.966783 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.966853 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.966863 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.966884 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:34 crc kubenswrapper[4704]: I1125 15:36:34.966903 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:34Z","lastTransitionTime":"2025-11-25T15:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.070207 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.070260 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.070275 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.070295 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.070308 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:35Z","lastTransitionTime":"2025-11-25T15:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.173655 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.173697 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.173706 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.173724 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.173734 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:35Z","lastTransitionTime":"2025-11-25T15:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.276814 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.276888 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.276899 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.276921 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.276934 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:35Z","lastTransitionTime":"2025-11-25T15:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.379259 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.379304 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.379312 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.379327 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.379338 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:35Z","lastTransitionTime":"2025-11-25T15:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.416619 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.416659 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:35 crc kubenswrapper[4704]: E1125 15:36:35.417103 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:35 crc kubenswrapper[4704]: E1125 15:36:35.417259 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.417445 4704 scope.go:117] "RemoveContainer" containerID="66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.482406 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.482448 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.482457 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.482473 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.482482 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:35Z","lastTransitionTime":"2025-11-25T15:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.585620 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.585662 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.585673 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.585690 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.585713 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:35Z","lastTransitionTime":"2025-11-25T15:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.688283 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.688352 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.688367 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.688391 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.688411 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:35Z","lastTransitionTime":"2025-11-25T15:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.792305 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.792372 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.792386 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.792407 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.792419 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:35Z","lastTransitionTime":"2025-11-25T15:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.882543 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovnkube-controller/2.log" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.885553 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerStarted","Data":"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606"} Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.886061 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.895339 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.895387 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.895396 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.895413 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.895422 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:35Z","lastTransitionTime":"2025-11-25T15:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.908389 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.920538 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.933901 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.954319 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"message\\\":\\\"onfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1125 
15:36:09.236915 6377 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.235648 6377 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97 after 0 failed attempt(s)\\\\nI1125 15:36:09.236928 6377 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97\\\\nI1125 15:36:09.236928 6377 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236939 6377 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236944 6377 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.236959 6377 services_controller.go:451] Built service 
openshift-authentication/oauth-open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:36:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:36:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/ru
n/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.966293 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.981350 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e
1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.995863 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:35Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.997924 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.997953 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.997964 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.997983 
4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:35 crc kubenswrapper[4704]: I1125 15:36:35.997997 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:35Z","lastTransitionTime":"2025-11-25T15:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.010854 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"762ff0fd-9b25-4158-bd68-957bbfa4298c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8b8c4cc5291456d1da07123c3cb28928c03f31896d7ecc39be043bf8b8d9ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a86e85d42ee299ff07f19e664e89971cb63b7cc1edd398c1bedf638314ee482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0b95608507532a76441cdae944fe025fdaf833fd16e62ca0851043de8bb308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-reso
urces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.023907 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac8c9734-f4a0-4021-a36b-495183e0c4ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca92917da8a2f82963a21de252aca3b6ca15646ff0a30a07dfc3a0c25682691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcaac718777a480f97457edf8afd0a55e9199d31cf857614486e92b2897541e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dcaac718777a480f97457edf8afd0a55e9199d31cf857614486e92b2897541e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.041583 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.054120 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.071619 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.085701 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.100010 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.100045 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.100055 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.100069 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.100079 4704 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:36Z","lastTransitionTime":"2025-11-25T15:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.100119 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:36 crc 
kubenswrapper[4704]: I1125 15:36:36.122179 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.140283 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.157921 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.176663 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89cb23cc625602134c0e14c76fa545386707fdb2815ffbc0563737edc6552ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:25Z\\\",\\\"message\\\":\\\"2025-11-25T15:35:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_a1966ac7-4d16-411b-a7fa-fca7a20aa9b2\\\\n2025-11-25T15:35:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a1966ac7-4d16-411b-a7fa-fca7a20aa9b2 to /host/opt/cni/bin/\\\\n2025-11-25T15:35:40Z [verbose] multus-daemon started\\\\n2025-11-25T15:35:40Z [verbose] Readiness Indicator file check\\\\n2025-11-25T15:36:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:36:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.189271 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://313f477ea11a405cea82c897651b0950eb648
d63a44b5292a774c5d943f21483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.202477 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.202543 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.202556 4704 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.202576 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.202593 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:36Z","lastTransitionTime":"2025-11-25T15:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.306722 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.306783 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.306830 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.306848 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.306858 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:36Z","lastTransitionTime":"2025-11-25T15:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.410130 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.410199 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.410212 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.410235 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.410250 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:36Z","lastTransitionTime":"2025-11-25T15:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.415545 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.415545 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:36 crc kubenswrapper[4704]: E1125 15:36:36.415682 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:36 crc kubenswrapper[4704]: E1125 15:36:36.415748 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.512549 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.512603 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.512612 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.512629 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.512640 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:36Z","lastTransitionTime":"2025-11-25T15:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.616782 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.616869 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.616882 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.616903 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.616916 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:36Z","lastTransitionTime":"2025-11-25T15:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.719269 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.719624 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.719697 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.719775 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.719901 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:36Z","lastTransitionTime":"2025-11-25T15:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.822432 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.822512 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.822526 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.822543 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.822555 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:36Z","lastTransitionTime":"2025-11-25T15:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.891896 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovnkube-controller/3.log" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.892513 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovnkube-controller/2.log" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.895572 4704 generic.go:334] "Generic (PLEG): container finished" podID="f5274608-0c76-48d9-949d-53254df99b83" containerID="c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606" exitCode=1 Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.895620 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerDied","Data":"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606"} Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.895916 4704 scope.go:117] "RemoveContainer" containerID="66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.896283 4704 scope.go:117] "RemoveContainer" containerID="c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606" Nov 25 15:36:36 crc kubenswrapper[4704]: E1125 15:36:36.896479 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.919156 4704 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20
d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783d
dfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.924712 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.924750 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.924761 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.924779 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.924809 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:36Z","lastTransitionTime":"2025-11-25T15:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.933017 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.947874 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.970397 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66e1aead4358c4e3bd9a2beb6aba48d99e5ab391a8e715a5a5eaa67f6f6bed8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"message\\\":\\\"onfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, 
protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1125 15:36:09.236915 6377 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.235648 6377 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97 after 0 failed attempt(s)\\\\nI1125 15:36:09.236928 6377 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97\\\\nI1125 15:36:09.236928 6377 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236939 6377 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1125 15:36:09.236944 6377 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1125 15:36:09.236959 6377 services_controller.go:451] Built service openshift-authentication/oauth-open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:36:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:36Z\\\",\\\"message\\\":\\\"s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1125 15:36:36.535927 6765 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:36:36.535826 6765 model_client.go:398] Mutate 
operations\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:36:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b056
1f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:36 crc kubenswrapper[4704]: I1125 15:36:36.986043 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.003859 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e
1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.020881 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.027115 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.027161 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.027197 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.027221 
4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.027236 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:37Z","lastTransitionTime":"2025-11-25T15:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.035860 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"762ff0fd-9b25-4158-bd68-957bbfa4298c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8b8c4cc5291456d1da07123c3cb28928c03f31896d7ecc39be043bf8b8d9ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a86e85d42ee299ff07f19e664e89971cb63b7cc1edd398c1bedf638314ee482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0b95608507532a76441cdae944fe025fdaf833fd16e62ca0851043de8bb308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-reso
urces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.049340 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac8c9734-f4a0-4021-a36b-495183e0c4ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca92917da8a2f82963a21de252aca3b6ca15646ff0a30a07dfc3a0c25682691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcaac718777a480f97457edf8afd0a55e9199d31cf857614486e92b2897541e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dcaac718777a480f97457edf8afd0a55e9199d31cf857614486e92b2897541e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.065868 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.082437 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.096201 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.110444 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.123598 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc 
kubenswrapper[4704]: I1125 15:36:37.129265 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.129306 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.129319 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.129357 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.129402 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:37Z","lastTransitionTime":"2025-11-25T15:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.138251 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.153456 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.166626 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.181493 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89cb23cc625602134c0e14c76fa545386707fdb2815ffbc0563737edc6552ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:25Z\\\",\\\"message\\\":\\\"2025-11-25T15:35:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_a1966ac7-4d16-411b-a7fa-fca7a20aa9b2\\\\n2025-11-25T15:35:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a1966ac7-4d16-411b-a7fa-fca7a20aa9b2 to /host/opt/cni/bin/\\\\n2025-11-25T15:35:40Z [verbose] multus-daemon started\\\\n2025-11-25T15:35:40Z [verbose] Readiness Indicator file check\\\\n2025-11-25T15:36:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:36:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.192166 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://313f477ea11a405cea82c897651b0950eb648
d63a44b5292a774c5d943f21483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.231661 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.231705 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.231718 4704 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.231736 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.231759 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:37Z","lastTransitionTime":"2025-11-25T15:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.334550 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.334694 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.334712 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.334730 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.334742 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:37Z","lastTransitionTime":"2025-11-25T15:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.415760 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.415816 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:37 crc kubenswrapper[4704]: E1125 15:36:37.415932 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:37 crc kubenswrapper[4704]: E1125 15:36:37.416146 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.438019 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.438062 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.438074 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.438094 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.438108 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:37Z","lastTransitionTime":"2025-11-25T15:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.541277 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.541350 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.541364 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.541383 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.541395 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:37Z","lastTransitionTime":"2025-11-25T15:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.643911 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.643969 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.643981 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.644003 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.644015 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:37Z","lastTransitionTime":"2025-11-25T15:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.746772 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.746856 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.746866 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.746885 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.746896 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:37Z","lastTransitionTime":"2025-11-25T15:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.848932 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.848978 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.848986 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.849002 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.849011 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:37Z","lastTransitionTime":"2025-11-25T15:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.900424 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovnkube-controller/3.log" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.904318 4704 scope.go:117] "RemoveContainer" containerID="c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606" Nov 25 15:36:37 crc kubenswrapper[4704]: E1125 15:36:37.904507 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.926450 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de30d049-ffb8-49b5-93bb-8c40f4d83e24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f550b44afb728fdc79303166f485c96c227c3fe74c33269765508461520f243a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791a9f4ecc158fbfe0f826de27781d50401c827849787c3d76b20d3bb760a32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6a8b2c845fdbdc3836b7babbc1bd7b73c03642ab9e627bb2e97920b1218b707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15668c6c116b6d1443c850db1bf90be9cb347ef1a7c4b3ea05ac02864ebc5ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c433bf5f12c7f4c6747887ff61237f29b9b36fa8065bc14e5af7eaaab1e4384\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc6ea7fecf506ec62b4aa87b0783ddfdd6f9163022699819fc13e8b139cb3aa6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2be274ecb3ee39bbe6c2961b5191d51046e5c17607f303a2f13a33654c781ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f0f1792357178171de20a3ae52d5df123550a70034350014a5efe2c375ab3fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.939460 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63cccc0b0cc95edd9ab6ca0bc0feae21883c7139ff8b8a62b3e8d4319dd9ac71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff7f5c6bbb4227f498d3756db74345ed28c26a15435ebacdccf21331ae75e1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.950825 4704 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.951898 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.952007 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.952022 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.952047 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.952062 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:37Z","lastTransitionTime":"2025-11-25T15:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.970571 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5274608-0c76-48d9-949d-53254df99b83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:36Z\\\",\\\"message\\\":\\\"s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1125 15:36:36.535927 6765 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:36Z is after 2025-08-24T17:21:41Z]\\\\nI1125 15:36:36.535826 6765 model_client.go:398] Mutate operations\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:36:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58b8550120fc356a2b
481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cw5d9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5kt46\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.984579 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l5scf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f430d4f1-803f-4dc6-a319-4e0b8836cf1e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c377f1740ee06c0676a0f786ba5b15eac00cbeebf162a9ad465a505c3183a652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://159b2864b24ae605ac16434233aa9689aa95b8c53d2ea527b59004443334982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8df62678362ea5599c0ed72f658abc17f56db43a38b18291c9b50a7165adad0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:39Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://535ca6d844173308a39a8fab8789c214d6442045b5694c5c50a331170b64943b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22a3e
1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22a3e1f139ca990c38d3e4bd90416967f6c66ea87ee94c6598645e3571e67ddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c876ed679706be79593022a806c62b410d8dad6bc210285ac217d1c67ad1c7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490002620c5c64737c9d0efecb91b3941fa718262d9f652d65697415dd404469\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dk7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l5scf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:37 crc kubenswrapper[4704]: I1125 15:36:37.996230 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dc56788-7e85-4387-a999-27b30af08433\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44ff8dfe9fda67111b51f800c0a8055bd6700392d156ee07d33bba0ceacd6b9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dc4bf03ff63fb84de4babc55a207636a22dd6fae32bdae5c4a181777888b909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c864d7463622f08a15cc957fa0d9467e5d3c83d09f27ae55f599dee45038f02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6fd19211723649051d2784f57747f013fbc313fa2dcd79902732beb7eac4c6\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:37Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.007434 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762ff0fd-9b25-4158-bd68-957bbfa4298c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf8b8c4cc5291456d1da07123c3cb28928c03f31896d7ecc39be043bf8b8d9ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a86e85d42ee299ff07f19e664e89971cb63b7cc1edd398c1bedf638314ee482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea0b95608507532a76441cdae944fe025fdaf833fd16e62ca0851043de8bb308\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://978a4be974ef4e8021721a05ea6dea880a799a5ec99b653504d862eb2f1c8c74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.017521 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac8c9734-f4a0-4021-a36b-495183e0c4ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ca92917da8a2f82963a21de252aca3b6ca15646ff0a30a07dfc3a0c25682691\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dcaac718777a480f97457edf8afd0a55e9199d31cf857614486e92b2897541e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dcaac718777a480f97457edf8afd0a55e9199d31cf857614486e92b2897541e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.029277 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8070f94f-2302-4fae-9524-6d938a6b22f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebb061901d3c2ac32b002496e82b1f565c82724db92a138be7003f807f7fa120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://383356a02190e237c66890ef24c59fcc5405961220a441593e9a2766d388408b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e64c11c621a8889f76f7ac2d8c9a5ffa478279aac97e6812c20b6cacc5c870e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5889c378fc838f7efbce9276761891ab463eb76bf5b5e75d49c7755fed333743\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dfb8fc8ae17f45534eecaf42dbde548405c37ebad6b8136a697ea9fcc675091\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T15:35:28Z\\\"
,\\\"message\\\":\\\"W1125 15:35:17.769442 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1125 15:35:17.769810 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764084917 cert, and key in /tmp/serving-cert-3465234532/serving-signer.crt, /tmp/serving-cert-3465234532/serving-signer.key\\\\nI1125 15:35:18.129157 1 observer_polling.go:159] Starting file observer\\\\nW1125 15:35:18.131840 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1125 15:35:18.132014 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 15:35:18.133566 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3465234532/tls.crt::/tmp/serving-cert-3465234532/tls.key\\\\\\\"\\\\nF1125 15:35:28.392081 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8198076a86560c27c631bec8ed82e2d203e522d4d869405d19642f10e450fd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e52ed70ffeb668a5bf6329517ec74c91465dab4a5e7e45c786da56121dec1d28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T15:35:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-25T15:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.042494 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.053263 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"91b52682-d008-4b8a-8bc3-26b032d7dc2c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78552b24e943f068a6bbd5468a2d3393e2262881c6a8bcd4662042e7420cde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6
a14dad101005c5b42d07f675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jfvv9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-djz8x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.055446 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.055481 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.055496 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:38 crc 
kubenswrapper[4704]: I1125 15:36:38.055514 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.055526 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:38Z","lastTransitionTime":"2025-11-25T15:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.063483 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8hrtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"709e7197-d9e5-4981-b4a3-249323be907c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc36cfee76820967056d5255ecf16e2d76f5c1226995f7f56c9c30b02cba930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xswmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8hrtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.075317 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9fq7f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55051ec6-6c32-4004-8f1a-3069433c80cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1be9c64f4d80661fc3d72ebaafaa66322bcd295b5f52abf1c4e451b027580f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t9r8k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9fq7f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.088473 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c2d9a5ea62675f195eb9ca6b34fc65522bde47636b00c04dc72636ec0797df7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-11-25T15:35:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.100284 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.112092 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://373493ba84d47662d4595e94790fc15db6f756005e2d2cccc415c73dcda39c72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-25T15:36:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.125424 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h92xm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:36:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89cb23cc625602134c0e14c76fa545386707fdb2815ffbc0563737edc6552ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T15:36:25Z\\\",\\\"message\\\":\\\"2025-11-25T15:35:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_a1966ac7-4d16-411b-a7fa-fca7a20aa9b2\\\\n2025-11-25T15:35:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a1966ac7-4d16-411b-a7fa-fca7a20aa9b2 to /host/opt/cni/bin/\\\\n2025-11-25T15:35:40Z [verbose] multus-daemon started\\\\n2025-11-25T15:35:40Z [verbose] Readiness Indicator file check\\\\n2025-11-25T15:36:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T15:35:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:36:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5x8mj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h92xm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.138752 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801589f-7db3-4c55-9232-29b5417286d2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b199609f9051ae77612ae12ecfe0c54c2cd7c008701075dc17cf4ff0f6d7b021\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://313f477ea11a405cea82c897651b0950eb648
d63a44b5292a774c5d943f21483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T15:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j29nl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kct97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.151464 4704 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T15:35:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wbgz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T15:35:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z6lnx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T15:36:38Z is after 2025-08-24T17:21:41Z" Nov 25 15:36:38 crc 
kubenswrapper[4704]: I1125 15:36:38.158418 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.158479 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.158491 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.158506 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.158516 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:38Z","lastTransitionTime":"2025-11-25T15:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.210318 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.210405 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.210473 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:38 crc kubenswrapper[4704]: E1125 15:36:38.210550 4704 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:36:38 crc kubenswrapper[4704]: E1125 15:36:38.210589 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.210556208 +0000 UTC m=+148.478829989 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:36:38 crc kubenswrapper[4704]: E1125 15:36:38.210655 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.21064593 +0000 UTC m=+148.478919931 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 15:36:38 crc kubenswrapper[4704]: E1125 15:36:38.210696 4704 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:36:38 crc kubenswrapper[4704]: E1125 15:36:38.210817 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.210775704 +0000 UTC m=+148.479049665 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.260985 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.261034 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.261045 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.261066 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.261079 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:38Z","lastTransitionTime":"2025-11-25T15:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.311671 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.311735 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:38 crc kubenswrapper[4704]: E1125 15:36:38.311959 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:36:38 crc kubenswrapper[4704]: E1125 15:36:38.311967 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 15:36:38 crc kubenswrapper[4704]: E1125 15:36:38.312028 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:36:38 crc kubenswrapper[4704]: E1125 15:36:38.312046 4704 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 
15:36:38 crc kubenswrapper[4704]: E1125 15:36:38.312102 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.312085127 +0000 UTC m=+148.580358908 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:36:38 crc kubenswrapper[4704]: E1125 15:36:38.311987 4704 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 15:36:38 crc kubenswrapper[4704]: E1125 15:36:38.312156 4704 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:36:38 crc kubenswrapper[4704]: E1125 15:36:38.312211 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.31219379 +0000 UTC m=+148.580467571 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.363643 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.363709 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.363721 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.363740 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.363753 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:38Z","lastTransitionTime":"2025-11-25T15:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.415740 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.415826 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:38 crc kubenswrapper[4704]: E1125 15:36:38.415952 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:38 crc kubenswrapper[4704]: E1125 15:36:38.416361 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.466895 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.466959 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.466974 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.466996 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.467010 4704 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:38Z","lastTransitionTime":"2025-11-25T15:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.569478 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.569533 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.569546 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.569570 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.569586 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:38Z","lastTransitionTime":"2025-11-25T15:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.671857 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.671894 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.671904 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.671918 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.671929 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:38Z","lastTransitionTime":"2025-11-25T15:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.776189 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.776257 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.776269 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.776288 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.776300 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:38Z","lastTransitionTime":"2025-11-25T15:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.879021 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.879075 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.879086 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.879101 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.879111 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:38Z","lastTransitionTime":"2025-11-25T15:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.981479 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.981532 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.981544 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.981563 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:38 crc kubenswrapper[4704]: I1125 15:36:38.981577 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:38Z","lastTransitionTime":"2025-11-25T15:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.084453 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.084506 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.084516 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.084534 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.084547 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:39Z","lastTransitionTime":"2025-11-25T15:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.187000 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.187039 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.187049 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.187065 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.187077 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:39Z","lastTransitionTime":"2025-11-25T15:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.290148 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.290207 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.290225 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.290243 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.290258 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:39Z","lastTransitionTime":"2025-11-25T15:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.392761 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.392846 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.392857 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.392879 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.392892 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:39Z","lastTransitionTime":"2025-11-25T15:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.415979 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.416064 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:39 crc kubenswrapper[4704]: E1125 15:36:39.416286 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:39 crc kubenswrapper[4704]: E1125 15:36:39.416521 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.495530 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.495625 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.495646 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.495665 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.495677 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:39Z","lastTransitionTime":"2025-11-25T15:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.598908 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.598975 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.598987 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.599006 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.599018 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:39Z","lastTransitionTime":"2025-11-25T15:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.701665 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.701728 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.701738 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.701752 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.701762 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:39Z","lastTransitionTime":"2025-11-25T15:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.804892 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.804938 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.804953 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.804971 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.804982 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:39Z","lastTransitionTime":"2025-11-25T15:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.907740 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.907823 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.907838 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.907855 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:39 crc kubenswrapper[4704]: I1125 15:36:39.907867 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:39Z","lastTransitionTime":"2025-11-25T15:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.010873 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.011265 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.011283 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.011302 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.011313 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:40Z","lastTransitionTime":"2025-11-25T15:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.114106 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.114148 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.114158 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.114175 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.114187 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:40Z","lastTransitionTime":"2025-11-25T15:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.217366 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.217408 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.217418 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.217433 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.217442 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:40Z","lastTransitionTime":"2025-11-25T15:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.319833 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.319900 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.319909 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.319925 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.319934 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:40Z","lastTransitionTime":"2025-11-25T15:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.415931 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.416120 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:40 crc kubenswrapper[4704]: E1125 15:36:40.416241 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:40 crc kubenswrapper[4704]: E1125 15:36:40.416358 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.422066 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.422136 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.422148 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.422166 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.422177 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:40Z","lastTransitionTime":"2025-11-25T15:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.525702 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.525846 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.525861 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.525938 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.526026 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:40Z","lastTransitionTime":"2025-11-25T15:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.629250 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.629311 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.629331 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.629353 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.629367 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:40Z","lastTransitionTime":"2025-11-25T15:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.732390 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.732462 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.732475 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.732490 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.732503 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:40Z","lastTransitionTime":"2025-11-25T15:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.835606 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.835824 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.835839 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.835858 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.835873 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:40Z","lastTransitionTime":"2025-11-25T15:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.939183 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.939230 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.939239 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.939255 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:40 crc kubenswrapper[4704]: I1125 15:36:40.939265 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:40Z","lastTransitionTime":"2025-11-25T15:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.041343 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.041864 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.041874 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.041888 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.041897 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:41Z","lastTransitionTime":"2025-11-25T15:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.144740 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.144813 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.144824 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.144839 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.144850 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:41Z","lastTransitionTime":"2025-11-25T15:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.247691 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.247727 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.247736 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.247751 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.247762 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:41Z","lastTransitionTime":"2025-11-25T15:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.320819 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.320857 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.320870 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.320890 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.320903 4704 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T15:36:41Z","lastTransitionTime":"2025-11-25T15:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.375473 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk"] Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.375954 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.377947 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.378117 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.378236 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.380258 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.402314 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9fq7f" podStartSLOduration=64.402295408 podStartE2EDuration="1m4.402295408s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:36:41.389684901 +0000 UTC m=+87.657958682" watchObservedRunningTime="2025-11-25 15:36:41.402295408 +0000 UTC m=+87.670569189" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.413183 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8hrtj" podStartSLOduration=64.413162765 podStartE2EDuration="1m4.413162765s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:36:41.402174995 +0000 UTC m=+87.670448786" watchObservedRunningTime="2025-11-25 15:36:41.413162765 +0000 UTC m=+87.681436546" Nov 25 15:36:41 
crc kubenswrapper[4704]: I1125 15:36:41.416856 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:41 crc kubenswrapper[4704]: E1125 15:36:41.417029 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.417238 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:41 crc kubenswrapper[4704]: E1125 15:36:41.417913 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.441744 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-h92xm" podStartSLOduration=64.441724807 podStartE2EDuration="1m4.441724807s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:36:41.44146274 +0000 UTC m=+87.709736541" watchObservedRunningTime="2025-11-25 15:36:41.441724807 +0000 UTC m=+87.709998578" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.456498 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kct97" podStartSLOduration=63.456479507 podStartE2EDuration="1m3.456479507s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:36:41.456179639 +0000 UTC m=+87.724453440" watchObservedRunningTime="2025-11-25 15:36:41.456479507 +0000 UTC m=+87.724753288" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.543926 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cb86e02-0e22-4790-a22d-cf5be6b0c3b3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-frnxk\" (UID: \"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.543971 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/5cb86e02-0e22-4790-a22d-cf5be6b0c3b3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-frnxk\" (UID: \"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.544038 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb86e02-0e22-4790-a22d-cf5be6b0c3b3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-frnxk\" (UID: \"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.544062 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5cb86e02-0e22-4790-a22d-cf5be6b0c3b3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-frnxk\" (UID: \"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.544094 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5cb86e02-0e22-4790-a22d-cf5be6b0c3b3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-frnxk\" (UID: \"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.573860 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=65.573839798 podStartE2EDuration="1m5.573839798s" podCreationTimestamp="2025-11-25 15:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-25 15:36:41.573530759 +0000 UTC m=+87.841804550" watchObservedRunningTime="2025-11-25 15:36:41.573839798 +0000 UTC m=+87.842113589" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.586958 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=32.586938159 podStartE2EDuration="32.586938159s" podCreationTimestamp="2025-11-25 15:36:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:36:41.585529028 +0000 UTC m=+87.853802819" watchObservedRunningTime="2025-11-25 15:36:41.586938159 +0000 UTC m=+87.855211940" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.596170 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=19.596147128 podStartE2EDuration="19.596147128s" podCreationTimestamp="2025-11-25 15:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:36:41.595903191 +0000 UTC m=+87.864176972" watchObservedRunningTime="2025-11-25 15:36:41.596147128 +0000 UTC m=+87.864420919" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.611764 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.611744722 podStartE2EDuration="1m8.611744722s" podCreationTimestamp="2025-11-25 15:35:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:36:41.611013531 +0000 UTC m=+87.879287312" watchObservedRunningTime="2025-11-25 15:36:41.611744722 +0000 UTC m=+87.880018533" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.636944 4704 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podStartSLOduration=64.636926786 podStartE2EDuration="1m4.636926786s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:36:41.636238236 +0000 UTC m=+87.904512037" watchObservedRunningTime="2025-11-25 15:36:41.636926786 +0000 UTC m=+87.905200567" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.644531 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5cb86e02-0e22-4790-a22d-cf5be6b0c3b3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-frnxk\" (UID: \"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.644575 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cb86e02-0e22-4790-a22d-cf5be6b0c3b3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-frnxk\" (UID: \"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.644611 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb86e02-0e22-4790-a22d-cf5be6b0c3b3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-frnxk\" (UID: \"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.644629 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5cb86e02-0e22-4790-a22d-cf5be6b0c3b3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-frnxk\" (UID: \"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.644654 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5cb86e02-0e22-4790-a22d-cf5be6b0c3b3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-frnxk\" (UID: \"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.644719 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5cb86e02-0e22-4790-a22d-cf5be6b0c3b3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-frnxk\" (UID: \"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.644726 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5cb86e02-0e22-4790-a22d-cf5be6b0c3b3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-frnxk\" (UID: \"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.645944 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5cb86e02-0e22-4790-a22d-cf5be6b0c3b3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-frnxk\" (UID: \"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: 
I1125 15:36:41.652438 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb86e02-0e22-4790-a22d-cf5be6b0c3b3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-frnxk\" (UID: \"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.664834 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cb86e02-0e22-4790-a22d-cf5be6b0c3b3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-frnxk\" (UID: \"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.670531 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-l5scf" podStartSLOduration=64.670507455 podStartE2EDuration="1m4.670507455s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:36:41.650913344 +0000 UTC m=+87.919187145" watchObservedRunningTime="2025-11-25 15:36:41.670507455 +0000 UTC m=+87.938781236" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.670952 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=68.670946878 podStartE2EDuration="1m8.670946878s" podCreationTimestamp="2025-11-25 15:35:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:36:41.670697361 +0000 UTC m=+87.938971142" watchObservedRunningTime="2025-11-25 15:36:41.670946878 +0000 UTC m=+87.939220659" Nov 25 15:36:41 crc 
kubenswrapper[4704]: I1125 15:36:41.697764 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.916313 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" event={"ID":"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3","Type":"ContainerStarted","Data":"804b5b90e307d28c4c83f6ecac3763da2b2dca9b1afdb942a71b532bf9cb583f"} Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.916364 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" event={"ID":"5cb86e02-0e22-4790-a22d-cf5be6b0c3b3","Type":"ContainerStarted","Data":"a79cd010d4e66f90523c9335d2a57b40d0a11801100363d4707701bc6795f6b4"} Nov 25 15:36:41 crc kubenswrapper[4704]: I1125 15:36:41.931697 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-frnxk" podStartSLOduration=64.931672437 podStartE2EDuration="1m4.931672437s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:36:41.931039718 +0000 UTC m=+88.199313499" watchObservedRunningTime="2025-11-25 15:36:41.931672437 +0000 UTC m=+88.199946248" Nov 25 15:36:42 crc kubenswrapper[4704]: I1125 15:36:42.416065 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:42 crc kubenswrapper[4704]: E1125 15:36:42.416208 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:42 crc kubenswrapper[4704]: I1125 15:36:42.416477 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:42 crc kubenswrapper[4704]: E1125 15:36:42.416527 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:43 crc kubenswrapper[4704]: I1125 15:36:43.415444 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:43 crc kubenswrapper[4704]: I1125 15:36:43.415480 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:43 crc kubenswrapper[4704]: E1125 15:36:43.415600 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:43 crc kubenswrapper[4704]: E1125 15:36:43.415842 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:44 crc kubenswrapper[4704]: I1125 15:36:44.416029 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:44 crc kubenswrapper[4704]: I1125 15:36:44.416029 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:44 crc kubenswrapper[4704]: E1125 15:36:44.417961 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:44 crc kubenswrapper[4704]: E1125 15:36:44.418190 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:45 crc kubenswrapper[4704]: I1125 15:36:45.416337 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:45 crc kubenswrapper[4704]: I1125 15:36:45.416360 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:45 crc kubenswrapper[4704]: E1125 15:36:45.417581 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:45 crc kubenswrapper[4704]: E1125 15:36:45.417738 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:46 crc kubenswrapper[4704]: I1125 15:36:46.416432 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:46 crc kubenswrapper[4704]: I1125 15:36:46.416474 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:46 crc kubenswrapper[4704]: E1125 15:36:46.416646 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:46 crc kubenswrapper[4704]: E1125 15:36:46.417140 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:47 crc kubenswrapper[4704]: I1125 15:36:47.416330 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:47 crc kubenswrapper[4704]: I1125 15:36:47.416451 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:47 crc kubenswrapper[4704]: E1125 15:36:47.416468 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:47 crc kubenswrapper[4704]: E1125 15:36:47.416756 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:48 crc kubenswrapper[4704]: I1125 15:36:48.416297 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:48 crc kubenswrapper[4704]: I1125 15:36:48.416507 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:48 crc kubenswrapper[4704]: E1125 15:36:48.416603 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:48 crc kubenswrapper[4704]: E1125 15:36:48.416849 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:49 crc kubenswrapper[4704]: I1125 15:36:49.415444 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:49 crc kubenswrapper[4704]: I1125 15:36:49.415492 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:49 crc kubenswrapper[4704]: E1125 15:36:49.415611 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:49 crc kubenswrapper[4704]: E1125 15:36:49.415992 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:50 crc kubenswrapper[4704]: I1125 15:36:50.416959 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:50 crc kubenswrapper[4704]: I1125 15:36:50.417039 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:50 crc kubenswrapper[4704]: E1125 15:36:50.417105 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:50 crc kubenswrapper[4704]: E1125 15:36:50.417320 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:50 crc kubenswrapper[4704]: I1125 15:36:50.418353 4704 scope.go:117] "RemoveContainer" containerID="c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606" Nov 25 15:36:50 crc kubenswrapper[4704]: E1125 15:36:50.418567 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" Nov 25 15:36:51 crc kubenswrapper[4704]: I1125 15:36:51.416256 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:51 crc kubenswrapper[4704]: I1125 15:36:51.416325 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:51 crc kubenswrapper[4704]: E1125 15:36:51.416415 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:51 crc kubenswrapper[4704]: E1125 15:36:51.416520 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:52 crc kubenswrapper[4704]: I1125 15:36:52.416370 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:52 crc kubenswrapper[4704]: I1125 15:36:52.416457 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:52 crc kubenswrapper[4704]: E1125 15:36:52.416534 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:52 crc kubenswrapper[4704]: E1125 15:36:52.416834 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:53 crc kubenswrapper[4704]: I1125 15:36:53.415387 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:53 crc kubenswrapper[4704]: E1125 15:36:53.415615 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:53 crc kubenswrapper[4704]: I1125 15:36:53.415387 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:53 crc kubenswrapper[4704]: E1125 15:36:53.416132 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:54 crc kubenswrapper[4704]: I1125 15:36:54.416197 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:54 crc kubenswrapper[4704]: I1125 15:36:54.416262 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:54 crc kubenswrapper[4704]: E1125 15:36:54.416944 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:54 crc kubenswrapper[4704]: E1125 15:36:54.417043 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:55 crc kubenswrapper[4704]: I1125 15:36:55.416013 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:55 crc kubenswrapper[4704]: I1125 15:36:55.416177 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:55 crc kubenswrapper[4704]: E1125 15:36:55.416319 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:55 crc kubenswrapper[4704]: E1125 15:36:55.416524 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:56 crc kubenswrapper[4704]: I1125 15:36:56.204862 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs\") pod \"network-metrics-daemon-z6lnx\" (UID: \"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\") " pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:56 crc kubenswrapper[4704]: E1125 15:36:56.205036 4704 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:36:56 crc kubenswrapper[4704]: E1125 15:36:56.205095 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs podName:b9cf8fad-2f72-4a94-958b-dd58fc76f4df nodeName:}" failed. 
No retries permitted until 2025-11-25 15:38:00.205075773 +0000 UTC m=+166.473349554 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs") pod "network-metrics-daemon-z6lnx" (UID: "b9cf8fad-2f72-4a94-958b-dd58fc76f4df") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 15:36:56 crc kubenswrapper[4704]: I1125 15:36:56.416159 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:56 crc kubenswrapper[4704]: I1125 15:36:56.416318 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:56 crc kubenswrapper[4704]: E1125 15:36:56.416451 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:56 crc kubenswrapper[4704]: E1125 15:36:56.416550 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:57 crc kubenswrapper[4704]: I1125 15:36:57.416190 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:57 crc kubenswrapper[4704]: I1125 15:36:57.416261 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:57 crc kubenswrapper[4704]: E1125 15:36:57.416430 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:36:57 crc kubenswrapper[4704]: E1125 15:36:57.416574 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:58 crc kubenswrapper[4704]: I1125 15:36:58.415665 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:36:58 crc kubenswrapper[4704]: I1125 15:36:58.415743 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:36:58 crc kubenswrapper[4704]: E1125 15:36:58.415864 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:36:58 crc kubenswrapper[4704]: E1125 15:36:58.415974 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:36:59 crc kubenswrapper[4704]: I1125 15:36:59.415375 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:36:59 crc kubenswrapper[4704]: I1125 15:36:59.415428 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:36:59 crc kubenswrapper[4704]: E1125 15:36:59.415544 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:36:59 crc kubenswrapper[4704]: E1125 15:36:59.415764 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:37:00 crc kubenswrapper[4704]: I1125 15:37:00.416341 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:00 crc kubenswrapper[4704]: I1125 15:37:00.416402 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:00 crc kubenswrapper[4704]: E1125 15:37:00.416507 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:37:00 crc kubenswrapper[4704]: E1125 15:37:00.416614 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:37:01 crc kubenswrapper[4704]: I1125 15:37:01.416236 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:01 crc kubenswrapper[4704]: E1125 15:37:01.416412 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:37:01 crc kubenswrapper[4704]: I1125 15:37:01.416522 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:37:01 crc kubenswrapper[4704]: E1125 15:37:01.416842 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:37:02 crc kubenswrapper[4704]: I1125 15:37:02.416132 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:02 crc kubenswrapper[4704]: I1125 15:37:02.416245 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:02 crc kubenswrapper[4704]: E1125 15:37:02.416559 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:37:02 crc kubenswrapper[4704]: E1125 15:37:02.416678 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:37:03 crc kubenswrapper[4704]: I1125 15:37:03.416029 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:03 crc kubenswrapper[4704]: I1125 15:37:03.415998 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:37:03 crc kubenswrapper[4704]: E1125 15:37:03.416218 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:37:03 crc kubenswrapper[4704]: E1125 15:37:03.416366 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:37:04 crc kubenswrapper[4704]: I1125 15:37:04.415602 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:04 crc kubenswrapper[4704]: I1125 15:37:04.417680 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:04 crc kubenswrapper[4704]: E1125 15:37:04.417753 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:37:04 crc kubenswrapper[4704]: E1125 15:37:04.417624 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:37:05 crc kubenswrapper[4704]: I1125 15:37:05.416235 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:37:05 crc kubenswrapper[4704]: I1125 15:37:05.417205 4704 scope.go:117] "RemoveContainer" containerID="c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606" Nov 25 15:37:05 crc kubenswrapper[4704]: I1125 15:37:05.416274 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:05 crc kubenswrapper[4704]: E1125 15:37:05.417402 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5kt46_openshift-ovn-kubernetes(f5274608-0c76-48d9-949d-53254df99b83)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" Nov 25 15:37:05 crc kubenswrapper[4704]: E1125 15:37:05.417417 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:37:05 crc kubenswrapper[4704]: E1125 15:37:05.417607 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:37:06 crc kubenswrapper[4704]: I1125 15:37:06.416360 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:06 crc kubenswrapper[4704]: I1125 15:37:06.416460 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:06 crc kubenswrapper[4704]: E1125 15:37:06.417285 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:37:06 crc kubenswrapper[4704]: E1125 15:37:06.417470 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:37:07 crc kubenswrapper[4704]: I1125 15:37:07.416210 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:37:07 crc kubenswrapper[4704]: I1125 15:37:07.416164 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:07 crc kubenswrapper[4704]: E1125 15:37:07.417619 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:37:07 crc kubenswrapper[4704]: E1125 15:37:07.417706 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:37:08 crc kubenswrapper[4704]: I1125 15:37:08.416264 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:08 crc kubenswrapper[4704]: I1125 15:37:08.416300 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:08 crc kubenswrapper[4704]: E1125 15:37:08.417379 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:37:08 crc kubenswrapper[4704]: E1125 15:37:08.417563 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:37:09 crc kubenswrapper[4704]: I1125 15:37:09.415689 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:37:09 crc kubenswrapper[4704]: E1125 15:37:09.415897 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:37:09 crc kubenswrapper[4704]: I1125 15:37:09.415978 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:09 crc kubenswrapper[4704]: E1125 15:37:09.416145 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:37:10 crc kubenswrapper[4704]: I1125 15:37:10.415891 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:10 crc kubenswrapper[4704]: I1125 15:37:10.415997 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:10 crc kubenswrapper[4704]: E1125 15:37:10.416073 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:37:10 crc kubenswrapper[4704]: E1125 15:37:10.416223 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:37:11 crc kubenswrapper[4704]: I1125 15:37:11.416146 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:11 crc kubenswrapper[4704]: E1125 15:37:11.417184 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:37:11 crc kubenswrapper[4704]: I1125 15:37:11.416226 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:37:11 crc kubenswrapper[4704]: E1125 15:37:11.417724 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:37:12 crc kubenswrapper[4704]: I1125 15:37:12.006458 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h92xm_d2820ade-e9bd-4146-b275-0c3b7d0cb5aa/kube-multus/1.log" Nov 25 15:37:12 crc kubenswrapper[4704]: I1125 15:37:12.006921 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h92xm_d2820ade-e9bd-4146-b275-0c3b7d0cb5aa/kube-multus/0.log" Nov 25 15:37:12 crc kubenswrapper[4704]: I1125 15:37:12.006966 4704 generic.go:334] "Generic (PLEG): container finished" podID="d2820ade-e9bd-4146-b275-0c3b7d0cb5aa" containerID="89cb23cc625602134c0e14c76fa545386707fdb2815ffbc0563737edc6552ff0" exitCode=1 Nov 25 15:37:12 crc kubenswrapper[4704]: I1125 15:37:12.007002 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h92xm" event={"ID":"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa","Type":"ContainerDied","Data":"89cb23cc625602134c0e14c76fa545386707fdb2815ffbc0563737edc6552ff0"} Nov 25 15:37:12 crc kubenswrapper[4704]: I1125 15:37:12.007051 4704 scope.go:117] "RemoveContainer" containerID="d21f39a273f7ff88ad4de9fc3cfc003554bebdd6953e836c031bebe27670ca59" Nov 25 15:37:12 crc kubenswrapper[4704]: I1125 15:37:12.007692 4704 scope.go:117] "RemoveContainer" containerID="89cb23cc625602134c0e14c76fa545386707fdb2815ffbc0563737edc6552ff0" Nov 25 15:37:12 crc kubenswrapper[4704]: E1125 15:37:12.008219 4704 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-h92xm_openshift-multus(d2820ade-e9bd-4146-b275-0c3b7d0cb5aa)\"" pod="openshift-multus/multus-h92xm" podUID="d2820ade-e9bd-4146-b275-0c3b7d0cb5aa" Nov 25 15:37:12 crc kubenswrapper[4704]: I1125 15:37:12.415954 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:12 crc kubenswrapper[4704]: I1125 15:37:12.416083 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:12 crc kubenswrapper[4704]: E1125 15:37:12.416214 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:37:12 crc kubenswrapper[4704]: E1125 15:37:12.416255 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:37:13 crc kubenswrapper[4704]: I1125 15:37:13.011648 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h92xm_d2820ade-e9bd-4146-b275-0c3b7d0cb5aa/kube-multus/1.log" Nov 25 15:37:13 crc kubenswrapper[4704]: I1125 15:37:13.415944 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:13 crc kubenswrapper[4704]: E1125 15:37:13.416125 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:37:13 crc kubenswrapper[4704]: I1125 15:37:13.416478 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:37:13 crc kubenswrapper[4704]: E1125 15:37:13.416684 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:37:14 crc kubenswrapper[4704]: E1125 15:37:14.414269 4704 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 25 15:37:14 crc kubenswrapper[4704]: I1125 15:37:14.416528 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:14 crc kubenswrapper[4704]: E1125 15:37:14.417699 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:37:14 crc kubenswrapper[4704]: I1125 15:37:14.418040 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:14 crc kubenswrapper[4704]: E1125 15:37:14.418121 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:37:14 crc kubenswrapper[4704]: E1125 15:37:14.531225 4704 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 25 15:37:15 crc kubenswrapper[4704]: I1125 15:37:15.415461 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:37:15 crc kubenswrapper[4704]: I1125 15:37:15.415524 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:15 crc kubenswrapper[4704]: E1125 15:37:15.415639 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:37:15 crc kubenswrapper[4704]: E1125 15:37:15.415745 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:37:16 crc kubenswrapper[4704]: I1125 15:37:16.416146 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:16 crc kubenswrapper[4704]: I1125 15:37:16.416701 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:16 crc kubenswrapper[4704]: E1125 15:37:16.416840 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:37:16 crc kubenswrapper[4704]: E1125 15:37:16.416926 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:37:17 crc kubenswrapper[4704]: I1125 15:37:17.415729 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:37:17 crc kubenswrapper[4704]: E1125 15:37:17.415923 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:37:17 crc kubenswrapper[4704]: I1125 15:37:17.415946 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:17 crc kubenswrapper[4704]: E1125 15:37:17.416139 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:37:17 crc kubenswrapper[4704]: I1125 15:37:17.417417 4704 scope.go:117] "RemoveContainer" containerID="c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606" Nov 25 15:37:18 crc kubenswrapper[4704]: I1125 15:37:18.031039 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovnkube-controller/3.log" Nov 25 15:37:18 crc kubenswrapper[4704]: I1125 15:37:18.034292 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerStarted","Data":"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082"} Nov 25 15:37:18 crc kubenswrapper[4704]: I1125 15:37:18.034747 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:37:18 crc kubenswrapper[4704]: I1125 15:37:18.062195 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podStartSLOduration=101.06217501 podStartE2EDuration="1m41.06217501s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:18.060886103 +0000 UTC m=+124.329159884" watchObservedRunningTime="2025-11-25 15:37:18.06217501 +0000 UTC m=+124.330448791" Nov 25 15:37:18 crc kubenswrapper[4704]: I1125 15:37:18.270665 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z6lnx"] Nov 25 15:37:18 crc kubenswrapper[4704]: I1125 15:37:18.270820 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:37:18 crc kubenswrapper[4704]: E1125 15:37:18.270992 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:37:18 crc kubenswrapper[4704]: I1125 15:37:18.416617 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:18 crc kubenswrapper[4704]: E1125 15:37:18.416809 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:37:18 crc kubenswrapper[4704]: I1125 15:37:18.416893 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:18 crc kubenswrapper[4704]: E1125 15:37:18.417134 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:37:19 crc kubenswrapper[4704]: I1125 15:37:19.415635 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:19 crc kubenswrapper[4704]: E1125 15:37:19.415854 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:37:19 crc kubenswrapper[4704]: E1125 15:37:19.533264 4704 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 25 15:37:20 crc kubenswrapper[4704]: I1125 15:37:20.415857 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:20 crc kubenswrapper[4704]: E1125 15:37:20.416006 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:37:20 crc kubenswrapper[4704]: I1125 15:37:20.416043 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:20 crc kubenswrapper[4704]: I1125 15:37:20.416062 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:37:20 crc kubenswrapper[4704]: E1125 15:37:20.416275 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:37:20 crc kubenswrapper[4704]: E1125 15:37:20.416324 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:37:21 crc kubenswrapper[4704]: I1125 15:37:21.415757 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:21 crc kubenswrapper[4704]: E1125 15:37:21.415958 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:37:22 crc kubenswrapper[4704]: I1125 15:37:22.415819 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:37:22 crc kubenswrapper[4704]: E1125 15:37:22.415979 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:37:22 crc kubenswrapper[4704]: I1125 15:37:22.416180 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:22 crc kubenswrapper[4704]: E1125 15:37:22.416231 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:37:22 crc kubenswrapper[4704]: I1125 15:37:22.416345 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:22 crc kubenswrapper[4704]: E1125 15:37:22.416398 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:37:23 crc kubenswrapper[4704]: I1125 15:37:23.416454 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:23 crc kubenswrapper[4704]: I1125 15:37:23.416582 4704 scope.go:117] "RemoveContainer" containerID="89cb23cc625602134c0e14c76fa545386707fdb2815ffbc0563737edc6552ff0" Nov 25 15:37:23 crc kubenswrapper[4704]: E1125 15:37:23.416605 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:37:24 crc kubenswrapper[4704]: I1125 15:37:24.053735 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h92xm_d2820ade-e9bd-4146-b275-0c3b7d0cb5aa/kube-multus/1.log" Nov 25 15:37:24 crc kubenswrapper[4704]: I1125 15:37:24.054241 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h92xm" event={"ID":"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa","Type":"ContainerStarted","Data":"6ebde3bce3ebb98df82c1e2217d50256663339143d5a82ad4958eeed412b4c81"} Nov 25 15:37:24 crc kubenswrapper[4704]: I1125 15:37:24.415978 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:24 crc kubenswrapper[4704]: I1125 15:37:24.416020 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:24 crc kubenswrapper[4704]: E1125 15:37:24.417384 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:37:24 crc kubenswrapper[4704]: I1125 15:37:24.417413 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:37:24 crc kubenswrapper[4704]: E1125 15:37:24.417490 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:37:24 crc kubenswrapper[4704]: E1125 15:37:24.417561 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:37:24 crc kubenswrapper[4704]: E1125 15:37:24.533871 4704 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Nov 25 15:37:25 crc kubenswrapper[4704]: I1125 15:37:25.415860 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:25 crc kubenswrapper[4704]: E1125 15:37:25.416386 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:37:26 crc kubenswrapper[4704]: I1125 15:37:26.418010 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:37:26 crc kubenswrapper[4704]: E1125 15:37:26.418173 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:37:26 crc kubenswrapper[4704]: I1125 15:37:26.418390 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:26 crc kubenswrapper[4704]: E1125 15:37:26.418460 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:37:26 crc kubenswrapper[4704]: I1125 15:37:26.418576 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:26 crc kubenswrapper[4704]: E1125 15:37:26.418619 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:37:27 crc kubenswrapper[4704]: I1125 15:37:27.416161 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:27 crc kubenswrapper[4704]: E1125 15:37:27.416333 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:37:28 crc kubenswrapper[4704]: I1125 15:37:28.415702 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:28 crc kubenswrapper[4704]: I1125 15:37:28.415769 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:28 crc kubenswrapper[4704]: E1125 15:37:28.415879 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 15:37:28 crc kubenswrapper[4704]: I1125 15:37:28.416005 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:37:28 crc kubenswrapper[4704]: E1125 15:37:28.416126 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 15:37:28 crc kubenswrapper[4704]: E1125 15:37:28.416407 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z6lnx" podUID="b9cf8fad-2f72-4a94-958b-dd58fc76f4df" Nov 25 15:37:29 crc kubenswrapper[4704]: I1125 15:37:29.415395 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:29 crc kubenswrapper[4704]: E1125 15:37:29.415568 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 15:37:30 crc kubenswrapper[4704]: I1125 15:37:30.416267 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:30 crc kubenswrapper[4704]: I1125 15:37:30.416267 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:30 crc kubenswrapper[4704]: I1125 15:37:30.416291 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx" Nov 25 15:37:30 crc kubenswrapper[4704]: I1125 15:37:30.422528 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 25 15:37:30 crc kubenswrapper[4704]: I1125 15:37:30.422632 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 25 15:37:30 crc kubenswrapper[4704]: I1125 15:37:30.422758 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 25 15:37:30 crc kubenswrapper[4704]: I1125 15:37:30.423429 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 25 15:37:30 crc kubenswrapper[4704]: I1125 15:37:30.423435 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 25 15:37:30 crc kubenswrapper[4704]: I1125 15:37:30.423590 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 25 15:37:31 crc kubenswrapper[4704]: I1125 15:37:31.416409 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.774924 4704 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.831181 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fz52t"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.831720 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.840433 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.841217 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.842073 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5llzt"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.842781 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:32 crc kubenswrapper[4704]: W1125 15:37:32.850084 4704 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7": failed to list *v1.Secret: secrets "machine-api-operator-dockercfg-mfbb7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Nov 25 15:37:32 crc kubenswrapper[4704]: E1125 15:37:32.850142 4704 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-mfbb7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-dockercfg-mfbb7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 25 15:37:32 crc kubenswrapper[4704]: W1125 15:37:32.850262 4704 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-images": failed to list *v1.ConfigMap: configmaps 
"machine-api-operator-images" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Nov 25 15:37:32 crc kubenswrapper[4704]: E1125 15:37:32.850284 4704 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-images\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-api-operator-images\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 25 15:37:32 crc kubenswrapper[4704]: W1125 15:37:32.852399 4704 reflector.go:561] object-"openshift-machine-api"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Nov 25 15:37:32 crc kubenswrapper[4704]: E1125 15:37:32.852622 4704 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 25 15:37:32 crc kubenswrapper[4704]: W1125 15:37:32.855948 4704 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-tls": failed to list *v1.Secret: secrets "machine-api-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Nov 25 15:37:32 crc kubenswrapper[4704]: 
E1125 15:37:32.856133 4704 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 25 15:37:32 crc kubenswrapper[4704]: W1125 15:37:32.855953 4704 reflector.go:561] object-"openshift-machine-api"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Nov 25 15:37:32 crc kubenswrapper[4704]: E1125 15:37:32.856288 4704 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.857456 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.858074 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.859394 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.859815 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p" Nov 25 15:37:32 crc kubenswrapper[4704]: W1125 15:37:32.860127 4704 reflector.go:561] object-"openshift-machine-api"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Nov 25 15:37:32 crc kubenswrapper[4704]: E1125 15:37:32.860192 4704 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.860707 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gc5rd"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.861255 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.861377 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z692d"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.866696 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.870208 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.870973 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.876622 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.877294 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h7k8v"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.877998 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pjf9j"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.878511 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.878641 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.879615 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-gbzgh"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.879718 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h7k8v" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.880647 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-gbzgh" Nov 25 15:37:32 crc kubenswrapper[4704]: W1125 15:37:32.894983 4704 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-config": failed to list *v1.ConfigMap: configmaps "machine-approver-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Nov 25 15:37:32 crc kubenswrapper[4704]: E1125 15:37:32.895314 4704 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-approver-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.895071 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.902367 4704 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.905672 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.906526 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-7pmpw"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.906776 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.907170 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916520 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4b54a115-ed61-47a7-b447-400aa4f75b1b-machine-approver-tls\") pod \"machine-approver-56656f9798-c7s9c\" (UID: \"4b54a115-ed61-47a7-b447-400aa4f75b1b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916561 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d6182fe-6ca6-4384-99ce-501079ac58ad-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gptmb\" (UID: \"0d6182fe-6ca6-4384-99ce-501079ac58ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916578 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-image-import-ca\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916596 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d60f2c5d-3a97-47a6-a311-dffb74233746-config\") pod \"authentication-operator-69f744f599-pjf9j\" (UID: \"d60f2c5d-3a97-47a6-a311-dffb74233746\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916612 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d60f2c5d-3a97-47a6-a311-dffb74233746-service-ca-bundle\") pod \"authentication-operator-69f744f599-pjf9j\" (UID: \"d60f2c5d-3a97-47a6-a311-dffb74233746\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916627 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6qm8\" (UniqueName: \"kubernetes.io/projected/4b54a115-ed61-47a7-b447-400aa4f75b1b-kube-api-access-r6qm8\") pod \"machine-approver-56656f9798-c7s9c\" (UID: \"4b54a115-ed61-47a7-b447-400aa4f75b1b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916643 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2b9c93c0-005e-4b54-a498-a4ae8418f839-etcd-client\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 
15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916658 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916675 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4b54a115-ed61-47a7-b447-400aa4f75b1b-auth-proxy-config\") pod \"machine-approver-56656f9798-c7s9c\" (UID: \"4b54a115-ed61-47a7-b447-400aa4f75b1b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916690 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-audit\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916705 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7b416e1d-da7d-4da7-9bae-210c815d4cf1-images\") pod \"machine-api-operator-5694c8668f-fz52t\" (UID: \"7b416e1d-da7d-4da7-9bae-210c815d4cf1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916717 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/17320021-32dc-4bef-befa-fa0a7c2b8533-audit-dir\") pod 
\"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916732 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-node-pullsecrets\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916746 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916761 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-audit-dir\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916775 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e167ba8-a633-42df-963a-913ba4fe20bf-oauth-serving-cert\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916879 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clsmr\" 
(UniqueName: \"kubernetes.io/projected/6a92f740-f168-4d3b-b225-a73109091d7d-kube-api-access-clsmr\") pod \"cluster-samples-operator-665b6dd947-h7k8v\" (UID: \"6a92f740-f168-4d3b-b225-a73109091d7d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h7k8v" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916897 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nsjv\" (UniqueName: \"kubernetes.io/projected/fe8e9530-3977-4dc5-abe0-f8c655b58f6a-kube-api-access-4nsjv\") pod \"downloads-7954f5f757-gbzgh\" (UID: \"fe8e9530-3977-4dc5-abe0-f8c655b58f6a\") " pod="openshift-console/downloads-7954f5f757-gbzgh" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916912 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916929 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b9c93c0-005e-4b54-a498-a4ae8418f839-audit-dir\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916943 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4h6l\" (UniqueName: \"kubernetes.io/projected/42554b00-c5ca-41d5-b84e-af36e56239c6-kube-api-access-l4h6l\") pod \"openshift-config-operator-7777fb866f-7q5r8\" (UID: \"42554b00-c5ca-41d5-b84e-af36e56239c6\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916961 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gc5rd\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916978 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txnfx\" (UniqueName: \"kubernetes.io/projected/7b416e1d-da7d-4da7-9bae-210c815d4cf1-kube-api-access-txnfx\") pod \"machine-api-operator-5694c8668f-fz52t\" (UID: \"7b416e1d-da7d-4da7-9bae-210c815d4cf1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.916997 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ce56dcb-a916-41ca-b706-df5e157576eb-serving-cert\") pod \"controller-manager-879f6c89f-gc5rd\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917013 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917030 4704 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-config\") pod \"controller-manager-879f6c89f-gc5rd\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917049 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lk4n\" (UniqueName: \"kubernetes.io/projected/2b9c93c0-005e-4b54-a498-a4ae8418f839-kube-api-access-9lk4n\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917063 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-encryption-config\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917091 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917108 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: 
\"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917123 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23e9d749-57d6-4ce1-a899-44b745738978-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zvh6p\" (UID: \"23e9d749-57d6-4ce1-a899-44b745738978\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917142 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23e9d749-57d6-4ce1-a899-44b745738978-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zvh6p\" (UID: \"23e9d749-57d6-4ce1-a899-44b745738978\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917158 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7b416e1d-da7d-4da7-9bae-210c815d4cf1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fz52t\" (UID: \"7b416e1d-da7d-4da7-9bae-210c815d4cf1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917173 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917201 
4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drjkd\" (UniqueName: \"kubernetes.io/projected/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-kube-api-access-drjkd\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917215 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b416e1d-da7d-4da7-9bae-210c815d4cf1-config\") pod \"machine-api-operator-5694c8668f-fz52t\" (UID: \"7b416e1d-da7d-4da7-9bae-210c815d4cf1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917230 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e167ba8-a633-42df-963a-913ba4fe20bf-console-serving-cert\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917244 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e167ba8-a633-42df-963a-913ba4fe20bf-service-ca\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917260 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: 
\"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917285 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d60f2c5d-3a97-47a6-a311-dffb74233746-serving-cert\") pod \"authentication-operator-69f744f599-pjf9j\" (UID: \"d60f2c5d-3a97-47a6-a311-dffb74233746\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917299 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hb67\" (UniqueName: \"kubernetes.io/projected/23e9d749-57d6-4ce1-a899-44b745738978-kube-api-access-4hb67\") pod \"openshift-apiserver-operator-796bbdcf4f-zvh6p\" (UID: \"23e9d749-57d6-4ce1-a899-44b745738978\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917316 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-config\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917330 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a92f740-f168-4d3b-b225-a73109091d7d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h7k8v\" (UID: \"6a92f740-f168-4d3b-b225-a73109091d7d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h7k8v" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917344 4704 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917360 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbzd6\" (UniqueName: \"kubernetes.io/projected/4ce56dcb-a916-41ca-b706-df5e157576eb-kube-api-access-bbzd6\") pod \"controller-manager-879f6c89f-gc5rd\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917374 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b9c93c0-005e-4b54-a498-a4ae8418f839-audit-policies\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917390 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2b9c93c0-005e-4b54-a498-a4ae8418f839-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917404 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-etcd-client\") pod \"apiserver-76f77b778f-z692d\" (UID: 
\"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917418 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e167ba8-a633-42df-963a-913ba4fe20bf-console-config\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917442 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-client-ca\") pod \"controller-manager-879f6c89f-gc5rd\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917456 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-serving-cert\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917470 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/42554b00-c5ca-41d5-b84e-af36e56239c6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7q5r8\" (UID: \"42554b00-c5ca-41d5-b84e-af36e56239c6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917487 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0d6182fe-6ca6-4384-99ce-501079ac58ad-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gptmb\" (UID: \"0d6182fe-6ca6-4384-99ce-501079ac58ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917502 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917519 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917539 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d60f2c5d-3a97-47a6-a311-dffb74233746-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pjf9j\" (UID: \"d60f2c5d-3a97-47a6-a311-dffb74233746\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917553 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-6s99s\" (UID: \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917568 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2b9c93c0-005e-4b54-a498-a4ae8418f839-encryption-config\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917584 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42554b00-c5ca-41d5-b84e-af36e56239c6-serving-cert\") pod \"openshift-config-operator-7777fb866f-7q5r8\" (UID: \"42554b00-c5ca-41d5-b84e-af36e56239c6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917600 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzzsp\" (UniqueName: \"kubernetes.io/projected/d60f2c5d-3a97-47a6-a311-dffb74233746-kube-api-access-wzzsp\") pod \"authentication-operator-69f744f599-pjf9j\" (UID: \"d60f2c5d-3a97-47a6-a311-dffb74233746\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917615 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-etcd-serving-ca\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917631 
4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b54a115-ed61-47a7-b447-400aa4f75b1b-config\") pod \"machine-approver-56656f9798-c7s9c\" (UID: \"4b54a115-ed61-47a7-b447-400aa4f75b1b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917645 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxncn\" (UniqueName: \"kubernetes.io/projected/6e167ba8-a633-42df-963a-913ba4fe20bf-kube-api-access-kxncn\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917659 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-config\") pod \"route-controller-manager-6576b87f9c-6s99s\" (UID: \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917673 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-client-ca\") pod \"route-controller-manager-6576b87f9c-6s99s\" (UID: \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917689 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b9c93c0-005e-4b54-a498-a4ae8418f839-serving-cert\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: 
\"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917703 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-audit-policies\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917726 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b9c93c0-005e-4b54-a498-a4ae8418f839-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917741 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917756 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n8z2\" (UniqueName: \"kubernetes.io/projected/17320021-32dc-4bef-befa-fa0a7c2b8533-kube-api-access-2n8z2\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917781 4704 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d6182fe-6ca6-4384-99ce-501079ac58ad-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gptmb\" (UID: \"0d6182fe-6ca6-4384-99ce-501079ac58ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917814 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddvj2\" (UniqueName: \"kubernetes.io/projected/0d6182fe-6ca6-4384-99ce-501079ac58ad-kube-api-access-ddvj2\") pod \"cluster-image-registry-operator-dc59b4c8b-gptmb\" (UID: \"0d6182fe-6ca6-4384-99ce-501079ac58ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917867 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e167ba8-a633-42df-963a-913ba4fe20bf-trusted-ca-bundle\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917884 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6sq2\" (UniqueName: \"kubernetes.io/projected/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-kube-api-access-l6sq2\") pod \"route-controller-manager-6576b87f9c-6s99s\" (UID: \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.917907 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/6e167ba8-a633-42df-963a-913ba4fe20bf-console-oauth-config\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.942455 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.943070 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.944078 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.944365 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.944626 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.944855 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.945092 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.945192 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.945250 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.945439 4704 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.945508 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.945622 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.945733 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.945934 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.945440 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.946075 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.946177 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.946268 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.946361 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.946825 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 
15:37:32.948002 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.948173 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nd74n"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.948818 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nd74n" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.949458 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.950007 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.950078 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.955729 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.956299 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.956714 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.956873 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.956999 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 
15:37:32.957080 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.957829 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.957981 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.958535 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qb7gf"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.959207 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qxl8w"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.959888 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qxl8w" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.960263 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p5r8t"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.960521 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.960760 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.962012 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.963119 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.963423 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.964255 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.964649 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.965379 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.966252 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.966354 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.966542 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.966662 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.966764 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.966859 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.966251 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.967058 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.967145 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fz52t"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.967318 4704 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.967527 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.967549 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.974187 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.975836 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.975906 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.976641 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.976770 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.977128 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.977502 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.978075 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.978583 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.979031 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.979240 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.979344 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.979499 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.979611 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.979713 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 25 15:37:32 crc 
kubenswrapper[4704]: I1125 15:37:32.982379 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.997214 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt"] Nov 25 15:37:32 crc kubenswrapper[4704]: I1125 15:37:32.998286 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.000053 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.000430 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.000685 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.001003 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.001770 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.002246 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.002718 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.003192 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 25 
15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.003434 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.003530 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.003613 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.003686 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.004693 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.006253 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.023602 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.024301 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.024999 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.025260 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.025731 4704 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.025854 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.025906 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.026182 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.026343 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.026406 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.026514 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.026692 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.027497 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.028616 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.030367 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.032448 4704 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kklq9"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.040427 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.040402 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drjkd\" (UniqueName: \"kubernetes.io/projected/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-kube-api-access-drjkd\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.040985 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b416e1d-da7d-4da7-9bae-210c815d4cf1-config\") pod \"machine-api-operator-5694c8668f-fz52t\" (UID: \"7b416e1d-da7d-4da7-9bae-210c815d4cf1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.041164 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e167ba8-a633-42df-963a-913ba4fe20bf-console-serving-cert\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.041314 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e167ba8-a633-42df-963a-913ba4fe20bf-service-ca\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.041350 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.041511 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-config\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.041655 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a92f740-f168-4d3b-b225-a73109091d7d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h7k8v\" (UID: \"6a92f740-f168-4d3b-b225-a73109091d7d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h7k8v" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.042729 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-config\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.044068 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.044258 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.044469 4704 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.044472 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.045116 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d60f2c5d-3a97-47a6-a311-dffb74233746-serving-cert\") pod \"authentication-operator-69f744f599-pjf9j\" (UID: \"d60f2c5d-3a97-47a6-a311-dffb74233746\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.045166 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hb67\" (UniqueName: \"kubernetes.io/projected/23e9d749-57d6-4ce1-a899-44b745738978-kube-api-access-4hb67\") pod \"openshift-apiserver-operator-796bbdcf4f-zvh6p\" (UID: \"23e9d749-57d6-4ce1-a899-44b745738978\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.048330 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.049363 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e167ba8-a633-42df-963a-913ba4fe20bf-service-ca\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 
15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.049577 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b9c93c0-005e-4b54-a498-a4ae8418f839-audit-policies\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.050017 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2b9c93c0-005e-4b54-a498-a4ae8418f839-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.050549 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.051016 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.053774 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-etcd-client\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.053849 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e167ba8-a633-42df-963a-913ba4fe20bf-console-config\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.053959 
4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbzd6\" (UniqueName: \"kubernetes.io/projected/4ce56dcb-a916-41ca-b706-df5e157576eb-kube-api-access-bbzd6\") pod \"controller-manager-879f6c89f-gc5rd\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.054072 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-client-ca\") pod \"controller-manager-879f6c89f-gc5rd\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.054099 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-serving-cert\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.054125 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/42554b00-c5ca-41d5-b84e-af36e56239c6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7q5r8\" (UID: \"42554b00-c5ca-41d5-b84e-af36e56239c6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.054269 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.055228 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.055410 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0d6182fe-6ca6-4384-99ce-501079ac58ad-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gptmb\" (UID: \"0d6182fe-6ca6-4384-99ce-501079ac58ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.055440 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.055737 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-serving-cert\") pod \"route-controller-manager-6576b87f9c-6s99s\" (UID: \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.056057 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d60f2c5d-3a97-47a6-a311-dffb74233746-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pjf9j\" (UID: \"d60f2c5d-3a97-47a6-a311-dffb74233746\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.056236 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2b9c93c0-005e-4b54-a498-a4ae8418f839-encryption-config\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.056385 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42554b00-c5ca-41d5-b84e-af36e56239c6-serving-cert\") pod \"openshift-config-operator-7777fb866f-7q5r8\" (UID: \"42554b00-c5ca-41d5-b84e-af36e56239c6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.056417 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzzsp\" (UniqueName: \"kubernetes.io/projected/d60f2c5d-3a97-47a6-a311-dffb74233746-kube-api-access-wzzsp\") pod \"authentication-operator-69f744f599-pjf9j\" (UID: \"d60f2c5d-3a97-47a6-a311-dffb74233746\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.056562 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-etcd-serving-ca\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.056591 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-config\") 
pod \"route-controller-manager-6576b87f9c-6s99s\" (UID: \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.056735 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-client-ca\") pod \"route-controller-manager-6576b87f9c-6s99s\" (UID: \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.056760 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b54a115-ed61-47a7-b447-400aa4f75b1b-config\") pod \"machine-approver-56656f9798-c7s9c\" (UID: \"4b54a115-ed61-47a7-b447-400aa4f75b1b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.056923 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxncn\" (UniqueName: \"kubernetes.io/projected/6e167ba8-a633-42df-963a-913ba4fe20bf-kube-api-access-kxncn\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.057296 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.058445 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2b9c93c0-005e-4b54-a498-a4ae8418f839-audit-policies\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 
15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.059163 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b9c93c0-005e-4b54-a498-a4ae8418f839-serving-cert\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.073667 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-audit-policies\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.073887 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.073999 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n8z2\" (UniqueName: \"kubernetes.io/projected/17320021-32dc-4bef-befa-fa0a7c2b8533-kube-api-access-2n8z2\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.074130 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b9c93c0-005e-4b54-a498-a4ae8418f839-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: 
\"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.074234 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddvj2\" (UniqueName: \"kubernetes.io/projected/0d6182fe-6ca6-4384-99ce-501079ac58ad-kube-api-access-ddvj2\") pod \"cluster-image-registry-operator-dc59b4c8b-gptmb\" (UID: \"0d6182fe-6ca6-4384-99ce-501079ac58ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.074363 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e167ba8-a633-42df-963a-913ba4fe20bf-trusted-ca-bundle\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.074479 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6sq2\" (UniqueName: \"kubernetes.io/projected/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-kube-api-access-l6sq2\") pod \"route-controller-manager-6576b87f9c-6s99s\" (UID: \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.074638 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d6182fe-6ca6-4384-99ce-501079ac58ad-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gptmb\" (UID: \"0d6182fe-6ca6-4384-99ce-501079ac58ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.074761 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e167ba8-a633-42df-963a-913ba4fe20bf-console-oauth-config\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.074890 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b9c93c0-005e-4b54-a498-a4ae8418f839-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.067278 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.074915 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4b54a115-ed61-47a7-b447-400aa4f75b1b-machine-approver-tls\") pod \"machine-approver-56656f9798-c7s9c\" (UID: \"4b54a115-ed61-47a7-b447-400aa4f75b1b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.075165 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d6182fe-6ca6-4384-99ce-501079ac58ad-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gptmb\" (UID: \"0d6182fe-6ca6-4384-99ce-501079ac58ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.075279 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-image-import-ca\") pod 
\"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.075391 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d60f2c5d-3a97-47a6-a311-dffb74233746-config\") pod \"authentication-operator-69f744f599-pjf9j\" (UID: \"d60f2c5d-3a97-47a6-a311-dffb74233746\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.067317 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-config\") pod \"route-controller-manager-6576b87f9c-6s99s\" (UID: \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.061962 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e167ba8-a633-42df-963a-913ba4fe20bf-console-serving-cert\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.075649 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6qm8\" (UniqueName: \"kubernetes.io/projected/4b54a115-ed61-47a7-b447-400aa4f75b1b-kube-api-access-r6qm8\") pod \"machine-approver-56656f9798-c7s9c\" (UID: \"4b54a115-ed61-47a7-b447-400aa4f75b1b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.075760 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/2b9c93c0-005e-4b54-a498-a4ae8418f839-etcd-client\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.075903 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.076019 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d60f2c5d-3a97-47a6-a311-dffb74233746-service-ca-bundle\") pod \"authentication-operator-69f744f599-pjf9j\" (UID: \"d60f2c5d-3a97-47a6-a311-dffb74233746\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.076117 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/17320021-32dc-4bef-befa-fa0a7c2b8533-audit-dir\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.076215 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4b54a115-ed61-47a7-b447-400aa4f75b1b-auth-proxy-config\") pod \"machine-approver-56656f9798-c7s9c\" (UID: \"4b54a115-ed61-47a7-b447-400aa4f75b1b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.076314 4704 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-audit\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.076408 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7b416e1d-da7d-4da7-9bae-210c815d4cf1-images\") pod \"machine-api-operator-5694c8668f-fz52t\" (UID: \"7b416e1d-da7d-4da7-9bae-210c815d4cf1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.076508 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.076607 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-audit-dir\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.076697 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e167ba8-a633-42df-963a-913ba4fe20bf-oauth-serving-cert\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.076819 4704 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-clsmr\" (UniqueName: \"kubernetes.io/projected/6a92f740-f168-4d3b-b225-a73109091d7d-kube-api-access-clsmr\") pod \"cluster-samples-operator-665b6dd947-h7k8v\" (UID: \"6a92f740-f168-4d3b-b225-a73109091d7d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h7k8v" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.076933 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-node-pullsecrets\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.077039 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nsjv\" (UniqueName: \"kubernetes.io/projected/fe8e9530-3977-4dc5-abe0-f8c655b58f6a-kube-api-access-4nsjv\") pod \"downloads-7954f5f757-gbzgh\" (UID: \"fe8e9530-3977-4dc5-abe0-f8c655b58f6a\") " pod="openshift-console/downloads-7954f5f757-gbzgh" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.077131 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.077242 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b9c93c0-005e-4b54-a498-a4ae8418f839-audit-dir\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.077338 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4h6l\" (UniqueName: \"kubernetes.io/projected/42554b00-c5ca-41d5-b84e-af36e56239c6-kube-api-access-l4h6l\") pod \"openshift-config-operator-7777fb866f-7q5r8\" (UID: \"42554b00-c5ca-41d5-b84e-af36e56239c6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.077445 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gc5rd\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.077539 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txnfx\" (UniqueName: \"kubernetes.io/projected/7b416e1d-da7d-4da7-9bae-210c815d4cf1-kube-api-access-txnfx\") pod \"machine-api-operator-5694c8668f-fz52t\" (UID: \"7b416e1d-da7d-4da7-9bae-210c815d4cf1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.077638 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ce56dcb-a916-41ca-b706-df5e157576eb-serving-cert\") pod \"controller-manager-879f6c89f-gc5rd\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.077731 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.077844 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-config\") pod \"controller-manager-879f6c89f-gc5rd\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.077959 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lk4n\" (UniqueName: \"kubernetes.io/projected/2b9c93c0-005e-4b54-a498-a4ae8418f839-kube-api-access-9lk4n\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.078060 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-encryption-config\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.078166 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.078510 4704 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d60f2c5d-3a97-47a6-a311-dffb74233746-service-ca-bundle\") pod \"authentication-operator-69f744f599-pjf9j\" (UID: \"d60f2c5d-3a97-47a6-a311-dffb74233746\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.067238 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.066045 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-etcd-serving-ca\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.079294 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.064585 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/42554b00-c5ca-41d5-b84e-af36e56239c6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7q5r8\" (UID: \"42554b00-c5ca-41d5-b84e-af36e56239c6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.064111 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d60f2c5d-3a97-47a6-a311-dffb74233746-serving-cert\") pod \"authentication-operator-69f744f599-pjf9j\" (UID: \"d60f2c5d-3a97-47a6-a311-dffb74233746\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.084990 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-audit\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.084997 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.085742 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.085872 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-serving-cert\") pod \"route-controller-manager-6576b87f9c-6s99s\" (UID: \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.085916 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.086065 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23e9d749-57d6-4ce1-a899-44b745738978-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zvh6p\" (UID: \"23e9d749-57d6-4ce1-a899-44b745738978\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.086108 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23e9d749-57d6-4ce1-a899-44b745738978-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zvh6p\" (UID: \"23e9d749-57d6-4ce1-a899-44b745738978\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.086141 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7b416e1d-da7d-4da7-9bae-210c815d4cf1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fz52t\" (UID: \"7b416e1d-da7d-4da7-9bae-210c815d4cf1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.086174 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" 
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.086761 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e167ba8-a633-42df-963a-913ba4fe20bf-trusted-ca-bundle\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.087613 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b9c93c0-005e-4b54-a498-a4ae8418f839-serving-cert\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.087706 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23e9d749-57d6-4ce1-a899-44b745738978-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zvh6p\" (UID: \"23e9d749-57d6-4ce1-a899-44b745738978\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.087895 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-audit-dir\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.088895 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e167ba8-a633-42df-963a-913ba4fe20bf-oauth-serving-cert\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 
15:37:33.089252 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.089140 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-etcd-client\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.089067 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.066297 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-client-ca\") pod \"controller-manager-879f6c89f-gc5rd\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.089973 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-node-pullsecrets\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc 
kubenswrapper[4704]: I1125 15:37:33.090173 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d60f2c5d-3a97-47a6-a311-dffb74233746-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pjf9j\" (UID: \"d60f2c5d-3a97-47a6-a311-dffb74233746\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.063164 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2b9c93c0-005e-4b54-a498-a4ae8418f839-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.090951 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.092160 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-config\") pod \"controller-manager-879f6c89f-gc5rd\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.093767 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gc5rd\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.095299 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-image-import-ca\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.067943 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-client-ca\") pod \"route-controller-manager-6576b87f9c-6s99s\" (UID: \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.096208 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23e9d749-57d6-4ce1-a899-44b745738978-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zvh6p\" (UID: \"23e9d749-57d6-4ce1-a899-44b745738978\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.098728 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d6182fe-6ca6-4384-99ce-501079ac58ad-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gptmb\" (UID: \"0d6182fe-6ca6-4384-99ce-501079ac58ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.060407 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 25 15:37:33 crc kubenswrapper[4704]: 
I1125 15:37:33.099550 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a92f740-f168-4d3b-b225-a73109091d7d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h7k8v\" (UID: \"6a92f740-f168-4d3b-b225-a73109091d7d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h7k8v" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.065215 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.067568 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.068238 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.068520 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.068565 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.066664 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kklq9" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.068620 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.074485 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-audit-policies\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.102893 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d60f2c5d-3a97-47a6-a311-dffb74233746-config\") pod \"authentication-operator-69f744f599-pjf9j\" (UID: \"d60f2c5d-3a97-47a6-a311-dffb74233746\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.061356 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e167ba8-a633-42df-963a-913ba4fe20bf-console-config\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.104011 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 
15:37:33.104042 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.101952 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ce56dcb-a916-41ca-b706-df5e157576eb-serving-cert\") pod \"controller-manager-879f6c89f-gc5rd\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.104246 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2b9c93c0-005e-4b54-a498-a4ae8418f839-audit-dir\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.104511 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.105909 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2b9c93c0-005e-4b54-a498-a4ae8418f839-encryption-config\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.071228 4704 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.104193 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0d6182fe-6ca6-4384-99ce-501079ac58ad-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gptmb\" (UID: \"0d6182fe-6ca6-4384-99ce-501079ac58ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.073526 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.107915 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-serving-cert\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.075459 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.077392 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.079258 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/17320021-32dc-4bef-befa-fa0a7c2b8533-audit-dir\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.108384 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/2b9c93c0-005e-4b54-a498-a4ae8418f839-etcd-client\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.105216 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.109430 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.111383 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.113606 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.114885 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-encryption-config\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.115627 4704 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4b54a115-ed61-47a7-b447-400aa4f75b1b-machine-approver-tls\") pod \"machine-approver-56656f9798-c7s9c\" (UID: \"4b54a115-ed61-47a7-b447-400aa4f75b1b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.115685 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.115728 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.116033 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.116677 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4b54a115-ed61-47a7-b447-400aa4f75b1b-auth-proxy-config\") pod \"machine-approver-56656f9798-c7s9c\" (UID: \"4b54a115-ed61-47a7-b447-400aa4f75b1b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.118121 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7h6hx"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.118189 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e167ba8-a633-42df-963a-913ba4fe20bf-console-oauth-config\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.118619 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.120846 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.121348 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8hvj9"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.121527 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7h6hx" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.121545 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.122186 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vsp9t"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.122649 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jrgmq"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.123143 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.123327 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsp9t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.123339 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrgmq" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.123662 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.124121 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7pmpw"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.124143 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z692d"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.124156 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.124169 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.124271 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.124778 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gc5rd"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.124869 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.125850 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-9whr7"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.126365 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-9whr7" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.126911 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42554b00-c5ca-41d5-b84e-af36e56239c6-serving-cert\") pod \"openshift-config-operator-7777fb866f-7q5r8\" (UID: \"42554b00-c5ca-41d5-b84e-af36e56239c6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.128642 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mc6bz"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.129708 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mc6bz" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.132881 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.133640 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5llzt"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.133729 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.133897 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.134221 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gbzgh"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.135215 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.136030 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nd74n"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.137312 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.139519 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.141312 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qxl8w"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 
15:37:33.143223 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qb7gf"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.144736 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.146954 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.148257 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.150273 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.150596 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pjf9j"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.154027 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.154560 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.154776 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.155591 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.160311 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.162201 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h7k8v"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.163268 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p5r8t"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.165033 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-b697g"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.166686 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-b697g" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.167238 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.172125 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.173223 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mc6bz"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.174692 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.176195 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.179925 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vsp9t"] Nov 25 15:37:33 crc 
kubenswrapper[4704]: I1125 15:37:33.179999 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jrgmq"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.181176 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7h6hx"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.182055 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kklq9"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.182178 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.184064 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.184974 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-rxdvg"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.186380 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.186493 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-rxdvg" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.187162 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd45c23d-4eaf-40d9-a735-eb804c875a59-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gff7t\" (UID: \"dd45c23d-4eaf-40d9-a735-eb804c875a59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.187199 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-serving-cert\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.187219 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-etcd-ca\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.187234 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd45c23d-4eaf-40d9-a735-eb804c875a59-metrics-tls\") pod \"ingress-operator-5b745b69d9-gff7t\" (UID: \"dd45c23d-4eaf-40d9-a735-eb804c875a59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.187251 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/5250fdec-a063-483a-9bd2-b4e11479c232-trusted-ca\") pod \"console-operator-58897d9998-nd74n\" (UID: \"5250fdec-a063-483a-9bd2-b4e11479c232\") " pod="openshift-console-operator/console-operator-58897d9998-nd74n" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.187287 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5250fdec-a063-483a-9bd2-b4e11479c232-config\") pod \"console-operator-58897d9998-nd74n\" (UID: \"5250fdec-a063-483a-9bd2-b4e11479c232\") " pod="openshift-console-operator/console-operator-58897d9998-nd74n" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.187313 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-etcd-service-ca\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.187343 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-config\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.187687 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t94w6\" (UniqueName: \"kubernetes.io/projected/2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9-kube-api-access-t94w6\") pod \"olm-operator-6b444d44fb-d8lwt\" (UID: \"2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 
15:37:33.187803 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9-srv-cert\") pod \"olm-operator-6b444d44fb-d8lwt\" (UID: \"2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.187852 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.187858 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-d8lwt\" (UID: \"2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.187897 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvt2w\" (UniqueName: \"kubernetes.io/projected/dd45c23d-4eaf-40d9-a735-eb804c875a59-kube-api-access-pvt2w\") pod \"ingress-operator-5b745b69d9-gff7t\" (UID: \"dd45c23d-4eaf-40d9-a735-eb804c875a59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.187932 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wklfx\" (UniqueName: \"kubernetes.io/projected/0ed8b5be-28a8-4dcf-ad65-16d392570684-kube-api-access-wklfx\") pod \"catalog-operator-68c6474976-tbhnz\" (UID: \"0ed8b5be-28a8-4dcf-ad65-16d392570684\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 
15:37:33.188013 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-etcd-client\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.188034 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd45c23d-4eaf-40d9-a735-eb804c875a59-trusted-ca\") pod \"ingress-operator-5b745b69d9-gff7t\" (UID: \"dd45c23d-4eaf-40d9-a735-eb804c875a59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.192167 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-b697g"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.194279 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8hvj9"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.194237 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.196234 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qxcj2"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.197772 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qxcj2"] Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.197977 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qxcj2" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.200166 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cgjh\" (UniqueName: \"kubernetes.io/projected/5250fdec-a063-483a-9bd2-b4e11479c232-kube-api-access-4cgjh\") pod \"console-operator-58897d9998-nd74n\" (UID: \"5250fdec-a063-483a-9bd2-b4e11479c232\") " pod="openshift-console-operator/console-operator-58897d9998-nd74n" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.200451 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fff4a5ac-b41a-4c64-b448-5a687e16e9cd-metrics-tls\") pod \"dns-operator-744455d44c-qxl8w\" (UID: \"fff4a5ac-b41a-4c64-b448-5a687e16e9cd\") " pod="openshift-dns-operator/dns-operator-744455d44c-qxl8w" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.200585 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsm5f\" (UniqueName: \"kubernetes.io/projected/fff4a5ac-b41a-4c64-b448-5a687e16e9cd-kube-api-access-nsm5f\") pod \"dns-operator-744455d44c-qxl8w\" (UID: \"fff4a5ac-b41a-4c64-b448-5a687e16e9cd\") " pod="openshift-dns-operator/dns-operator-744455d44c-qxl8w" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.200697 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0ed8b5be-28a8-4dcf-ad65-16d392570684-profile-collector-cert\") pod \"catalog-operator-68c6474976-tbhnz\" (UID: \"0ed8b5be-28a8-4dcf-ad65-16d392570684\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.201076 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/5250fdec-a063-483a-9bd2-b4e11479c232-serving-cert\") pod \"console-operator-58897d9998-nd74n\" (UID: \"5250fdec-a063-483a-9bd2-b4e11479c232\") " pod="openshift-console-operator/console-operator-58897d9998-nd74n" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.201257 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0ed8b5be-28a8-4dcf-ad65-16d392570684-srv-cert\") pod \"catalog-operator-68c6474976-tbhnz\" (UID: \"0ed8b5be-28a8-4dcf-ad65-16d392570684\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.201317 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dph8\" (UniqueName: \"kubernetes.io/projected/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-kube-api-access-7dph8\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.214008 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.235037 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.255701 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.275045 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.293983 4704 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.302621 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5250fdec-a063-483a-9bd2-b4e11479c232-config\") pod \"console-operator-58897d9998-nd74n\" (UID: \"5250fdec-a063-483a-9bd2-b4e11479c232\") " pod="openshift-console-operator/console-operator-58897d9998-nd74n" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.302674 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-etcd-service-ca\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.302711 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-config\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.303524 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-config\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.303617 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5250fdec-a063-483a-9bd2-b4e11479c232-config\") pod \"console-operator-58897d9998-nd74n\" (UID: \"5250fdec-a063-483a-9bd2-b4e11479c232\") " 
pod="openshift-console-operator/console-operator-58897d9998-nd74n" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.303731 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t94w6\" (UniqueName: \"kubernetes.io/projected/2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9-kube-api-access-t94w6\") pod \"olm-operator-6b444d44fb-d8lwt\" (UID: \"2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.303778 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9-srv-cert\") pod \"olm-operator-6b444d44fb-d8lwt\" (UID: \"2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.303974 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-d8lwt\" (UID: \"2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.304000 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-etcd-service-ca\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.304015 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvt2w\" (UniqueName: 
\"kubernetes.io/projected/dd45c23d-4eaf-40d9-a735-eb804c875a59-kube-api-access-pvt2w\") pod \"ingress-operator-5b745b69d9-gff7t\" (UID: \"dd45c23d-4eaf-40d9-a735-eb804c875a59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.304046 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wklfx\" (UniqueName: \"kubernetes.io/projected/0ed8b5be-28a8-4dcf-ad65-16d392570684-kube-api-access-wklfx\") pod \"catalog-operator-68c6474976-tbhnz\" (UID: \"0ed8b5be-28a8-4dcf-ad65-16d392570684\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.304096 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-etcd-client\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.304120 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd45c23d-4eaf-40d9-a735-eb804c875a59-trusted-ca\") pod \"ingress-operator-5b745b69d9-gff7t\" (UID: \"dd45c23d-4eaf-40d9-a735-eb804c875a59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.304145 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cgjh\" (UniqueName: \"kubernetes.io/projected/5250fdec-a063-483a-9bd2-b4e11479c232-kube-api-access-4cgjh\") pod \"console-operator-58897d9998-nd74n\" (UID: \"5250fdec-a063-483a-9bd2-b4e11479c232\") " pod="openshift-console-operator/console-operator-58897d9998-nd74n" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.304214 4704 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fff4a5ac-b41a-4c64-b448-5a687e16e9cd-metrics-tls\") pod \"dns-operator-744455d44c-qxl8w\" (UID: \"fff4a5ac-b41a-4c64-b448-5a687e16e9cd\") " pod="openshift-dns-operator/dns-operator-744455d44c-qxl8w" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.304238 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsm5f\" (UniqueName: \"kubernetes.io/projected/fff4a5ac-b41a-4c64-b448-5a687e16e9cd-kube-api-access-nsm5f\") pod \"dns-operator-744455d44c-qxl8w\" (UID: \"fff4a5ac-b41a-4c64-b448-5a687e16e9cd\") " pod="openshift-dns-operator/dns-operator-744455d44c-qxl8w" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.304260 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0ed8b5be-28a8-4dcf-ad65-16d392570684-profile-collector-cert\") pod \"catalog-operator-68c6474976-tbhnz\" (UID: \"0ed8b5be-28a8-4dcf-ad65-16d392570684\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.304293 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5250fdec-a063-483a-9bd2-b4e11479c232-serving-cert\") pod \"console-operator-58897d9998-nd74n\" (UID: \"5250fdec-a063-483a-9bd2-b4e11479c232\") " pod="openshift-console-operator/console-operator-58897d9998-nd74n" Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.304326 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0ed8b5be-28a8-4dcf-ad65-16d392570684-srv-cert\") pod \"catalog-operator-68c6474976-tbhnz\" (UID: \"0ed8b5be-28a8-4dcf-ad65-16d392570684\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.304356 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dph8\" (UniqueName: \"kubernetes.io/projected/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-kube-api-access-7dph8\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.305012 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd45c23d-4eaf-40d9-a735-eb804c875a59-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gff7t\" (UID: \"dd45c23d-4eaf-40d9-a735-eb804c875a59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.305062 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-serving-cert\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.305091 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-etcd-ca\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.305117 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd45c23d-4eaf-40d9-a735-eb804c875a59-metrics-tls\") pod \"ingress-operator-5b745b69d9-gff7t\" (UID: \"dd45c23d-4eaf-40d9-a735-eb804c875a59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.305213 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5250fdec-a063-483a-9bd2-b4e11479c232-trusted-ca\") pod \"console-operator-58897d9998-nd74n\" (UID: \"5250fdec-a063-483a-9bd2-b4e11479c232\") " pod="openshift-console-operator/console-operator-58897d9998-nd74n"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.305646 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd45c23d-4eaf-40d9-a735-eb804c875a59-trusted-ca\") pod \"ingress-operator-5b745b69d9-gff7t\" (UID: \"dd45c23d-4eaf-40d9-a735-eb804c875a59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.305833 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-etcd-ca\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.306174 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5250fdec-a063-483a-9bd2-b4e11479c232-trusted-ca\") pod \"console-operator-58897d9998-nd74n\" (UID: \"5250fdec-a063-483a-9bd2-b4e11479c232\") " pod="openshift-console-operator/console-operator-58897d9998-nd74n"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.308042 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fff4a5ac-b41a-4c64-b448-5a687e16e9cd-metrics-tls\") pod \"dns-operator-744455d44c-qxl8w\" (UID: \"fff4a5ac-b41a-4c64-b448-5a687e16e9cd\") " pod="openshift-dns-operator/dns-operator-744455d44c-qxl8w"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.308725 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5250fdec-a063-483a-9bd2-b4e11479c232-serving-cert\") pod \"console-operator-58897d9998-nd74n\" (UID: \"5250fdec-a063-483a-9bd2-b4e11479c232\") " pod="openshift-console-operator/console-operator-58897d9998-nd74n"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.309173 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd45c23d-4eaf-40d9-a735-eb804c875a59-metrics-tls\") pod \"ingress-operator-5b745b69d9-gff7t\" (UID: \"dd45c23d-4eaf-40d9-a735-eb804c875a59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.315339 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.318678 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-serving-cert\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.334305 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.337479 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-etcd-client\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.355301 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.374358 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.395493 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.414552 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.419239 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0ed8b5be-28a8-4dcf-ad65-16d392570684-srv-cert\") pod \"catalog-operator-68c6474976-tbhnz\" (UID: \"0ed8b5be-28a8-4dcf-ad65-16d392570684\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.434680 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.438860 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-d8lwt\" (UID: \"2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.438860 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0ed8b5be-28a8-4dcf-ad65-16d392570684-profile-collector-cert\") pod \"catalog-operator-68c6474976-tbhnz\" (UID: \"0ed8b5be-28a8-4dcf-ad65-16d392570684\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.455294 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.473724 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.494233 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.514185 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.517417 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9-srv-cert\") pod \"olm-operator-6b444d44fb-d8lwt\" (UID: \"2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.569104 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drjkd\" (UniqueName: \"kubernetes.io/projected/5d946af5-4a4e-476e-ad32-3eae6ad6c8f7-kube-api-access-drjkd\") pod \"apiserver-76f77b778f-z692d\" (UID: \"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7\") " pod="openshift-apiserver/apiserver-76f77b778f-z692d"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.589148 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hb67\" (UniqueName: \"kubernetes.io/projected/23e9d749-57d6-4ce1-a899-44b745738978-kube-api-access-4hb67\") pod \"openshift-apiserver-operator-796bbdcf4f-zvh6p\" (UID: \"23e9d749-57d6-4ce1-a899-44b745738978\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.629215 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbzd6\" (UniqueName: \"kubernetes.io/projected/4ce56dcb-a916-41ca-b706-df5e157576eb-kube-api-access-bbzd6\") pod \"controller-manager-879f6c89f-gc5rd\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.648667 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzzsp\" (UniqueName: \"kubernetes.io/projected/d60f2c5d-3a97-47a6-a311-dffb74233746-kube-api-access-wzzsp\") pod \"authentication-operator-69f744f599-pjf9j\" (UID: \"d60f2c5d-3a97-47a6-a311-dffb74233746\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.669952 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxncn\" (UniqueName: \"kubernetes.io/projected/6e167ba8-a633-42df-963a-913ba4fe20bf-kube-api-access-kxncn\") pod \"console-f9d7485db-7pmpw\" (UID: \"6e167ba8-a633-42df-963a-913ba4fe20bf\") " pod="openshift-console/console-f9d7485db-7pmpw"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.674397 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.708534 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n8z2\" (UniqueName: \"kubernetes.io/projected/17320021-32dc-4bef-befa-fa0a7c2b8533-kube-api-access-2n8z2\") pod \"oauth-openshift-558db77b4-5llzt\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " pod="openshift-authentication/oauth-openshift-558db77b4-5llzt"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.714834 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.734515 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.754898 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7pmpw"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.773619 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.774142 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5llzt"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.776013 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddvj2\" (UniqueName: \"kubernetes.io/projected/0d6182fe-6ca6-4384-99ce-501079ac58ad-kube-api-access-ddvj2\") pod \"cluster-image-registry-operator-dc59b4c8b-gptmb\" (UID: \"0d6182fe-6ca6-4384-99ce-501079ac58ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.794868 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.807804 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.824173 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.832672 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clsmr\" (UniqueName: \"kubernetes.io/projected/6a92f740-f168-4d3b-b225-a73109091d7d-kube-api-access-clsmr\") pod \"cluster-samples-operator-665b6dd947-h7k8v\" (UID: \"6a92f740-f168-4d3b-b225-a73109091d7d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h7k8v"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.837633 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-z692d"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.865117 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6sq2\" (UniqueName: \"kubernetes.io/projected/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-kube-api-access-l6sq2\") pod \"route-controller-manager-6576b87f9c-6s99s\" (UID: \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.868685 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.878098 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nsjv\" (UniqueName: \"kubernetes.io/projected/fe8e9530-3977-4dc5-abe0-f8c655b58f6a-kube-api-access-4nsjv\") pod \"downloads-7954f5f757-gbzgh\" (UID: \"fe8e9530-3977-4dc5-abe0-f8c655b58f6a\") " pod="openshift-console/downloads-7954f5f757-gbzgh"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.880581 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h7k8v"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.894956 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lk4n\" (UniqueName: \"kubernetes.io/projected/2b9c93c0-005e-4b54-a498-a4ae8418f839-kube-api-access-9lk4n\") pod \"apiserver-7bbb656c7d-cqkjk\" (UID: \"2b9c93c0-005e-4b54-a498-a4ae8418f839\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.914996 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-gbzgh"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.918305 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4h6l\" (UniqueName: \"kubernetes.io/projected/42554b00-c5ca-41d5-b84e-af36e56239c6-kube-api-access-l4h6l\") pod \"openshift-config-operator-7777fb866f-7q5r8\" (UID: \"42554b00-c5ca-41d5-b84e-af36e56239c6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.931262 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.932101 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d6182fe-6ca6-4384-99ce-501079ac58ad-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gptmb\" (UID: \"0d6182fe-6ca6-4384-99ce-501079ac58ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.975090 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.977377 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6qm8\" (UniqueName: \"kubernetes.io/projected/4b54a115-ed61-47a7-b447-400aa4f75b1b-kube-api-access-r6qm8\") pod \"machine-approver-56656f9798-c7s9c\" (UID: \"4b54a115-ed61-47a7-b447-400aa4f75b1b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c"
Nov 25 15:37:33 crc kubenswrapper[4704]: I1125 15:37:33.995164 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.016812 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.036408 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 25 15:37:34 crc kubenswrapper[4704]: E1125 15:37:34.043962 4704 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Nov 25 15:37:34 crc kubenswrapper[4704]: E1125 15:37:34.044152 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b416e1d-da7d-4da7-9bae-210c815d4cf1-config podName:7b416e1d-da7d-4da7-9bae-210c815d4cf1 nodeName:}" failed. No retries permitted until 2025-11-25 15:37:34.544126161 +0000 UTC m=+140.812399942 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/7b416e1d-da7d-4da7-9bae-210c815d4cf1-config") pod "machine-api-operator-5694c8668f-fz52t" (UID: "7b416e1d-da7d-4da7-9bae-210c815d4cf1") : failed to sync configmap cache: timed out waiting for the condition
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.055452 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.058895 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb"
Nov 25 15:37:34 crc kubenswrapper[4704]: E1125 15:37:34.068320 4704 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition
Nov 25 15:37:34 crc kubenswrapper[4704]: E1125 15:37:34.068423 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4b54a115-ed61-47a7-b447-400aa4f75b1b-config podName:4b54a115-ed61-47a7-b447-400aa4f75b1b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:34.568402694 +0000 UTC m=+140.836676465 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/4b54a115-ed61-47a7-b447-400aa4f75b1b-config") pod "machine-approver-56656f9798-c7s9c" (UID: "4b54a115-ed61-47a7-b447-400aa4f75b1b") : failed to sync configmap cache: timed out waiting for the condition
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.074087 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Nov 25 15:37:34 crc kubenswrapper[4704]: E1125 15:37:34.086420 4704 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition
Nov 25 15:37:34 crc kubenswrapper[4704]: E1125 15:37:34.086544 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b416e1d-da7d-4da7-9bae-210c815d4cf1-images podName:7b416e1d-da7d-4da7-9bae-210c815d4cf1 nodeName:}" failed. No retries permitted until 2025-11-25 15:37:34.586515299 +0000 UTC m=+140.854789080 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/7b416e1d-da7d-4da7-9bae-210c815d4cf1-images") pod "machine-api-operator-5694c8668f-fz52t" (UID: "7b416e1d-da7d-4da7-9bae-210c815d4cf1") : failed to sync configmap cache: timed out waiting for the condition
Nov 25 15:37:34 crc kubenswrapper[4704]: E1125 15:37:34.088601 4704 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition
Nov 25 15:37:34 crc kubenswrapper[4704]: E1125 15:37:34.088667 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b416e1d-da7d-4da7-9bae-210c815d4cf1-machine-api-operator-tls podName:7b416e1d-da7d-4da7-9bae-210c815d4cf1 nodeName:}" failed. No retries permitted until 2025-11-25 15:37:34.588652505 +0000 UTC m=+140.856926286 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/7b416e1d-da7d-4da7-9bae-210c815d4cf1-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-fz52t" (UID: "7b416e1d-da7d-4da7-9bae-210c815d4cf1") : failed to sync secret cache: timed out waiting for the condition
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.094816 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.095485 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.114669 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.132072 4704 request.go:700] Waited for 1.014166366s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dconfig&limit=500&resourceVersion=0
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.133900 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.155445 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.156421 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.174482 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.194633 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.215923 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.238314 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.255936 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.276691 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.279317 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p"]
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.280797 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7pmpw"]
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.287284 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5llzt"]
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.294190 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.302262 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gbzgh"]
Nov 25 15:37:34 crc kubenswrapper[4704]: W1125 15:37:34.313953 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe8e9530_3977_4dc5_abe0_f8c655b58f6a.slice/crio-ee2babbdde2ee9e479bdd8a20265056d95d78ccf20e7c930ea09ee2c2890e12d WatchSource:0}: Error finding container ee2babbdde2ee9e479bdd8a20265056d95d78ccf20e7c930ea09ee2c2890e12d: Status 404 returned error can't find the container with id ee2babbdde2ee9e479bdd8a20265056d95d78ccf20e7c930ea09ee2c2890e12d
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.314926 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.333991 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.339948 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pjf9j"]
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.353944 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.374803 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8"]
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.375563 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.383859 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gc5rd"]
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.394775 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.414601 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.433611 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.441514 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z692d"]
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.442564 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s"]
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.445573 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h7k8v"]
Nov 25 15:37:34 crc kubenswrapper[4704]: W1125 15:37:34.447584 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd60f2c5d_3a97_47a6_a311_dffb74233746.slice/crio-ba061b74330a28b102ed87f04ce3c540e202ecd469387ab901ac9252793ffc44 WatchSource:0}: Error finding container ba061b74330a28b102ed87f04ce3c540e202ecd469387ab901ac9252793ffc44: Status 404 returned error can't find the container with id ba061b74330a28b102ed87f04ce3c540e202ecd469387ab901ac9252793ffc44
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.454343 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Nov 25 15:37:34 crc kubenswrapper[4704]: W1125 15:37:34.459932 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ce56dcb_a916_41ca_b706_df5e157576eb.slice/crio-279430a122b5f1dac181ab543d7959f7e06e979b1dc10ce71558bd5a618ce27e WatchSource:0}: Error finding container 279430a122b5f1dac181ab543d7959f7e06e979b1dc10ce71558bd5a618ce27e: Status 404 returned error can't find the container with id 279430a122b5f1dac181ab543d7959f7e06e979b1dc10ce71558bd5a618ce27e
Nov 25 15:37:34 crc kubenswrapper[4704]: W1125 15:37:34.465498 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d946af5_4a4e_476e_ad32_3eae6ad6c8f7.slice/crio-e5f27c49be2e63772d4797e72926c5f5ba7f324d6159d02ba3cd131a04b15263 WatchSource:0}: Error finding container e5f27c49be2e63772d4797e72926c5f5ba7f324d6159d02ba3cd131a04b15263: Status 404 returned error can't find the container with id e5f27c49be2e63772d4797e72926c5f5ba7f324d6159d02ba3cd131a04b15263
Nov 25 15:37:34 crc kubenswrapper[4704]: W1125 15:37:34.467564 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49cc56f5_b9bb_4694_8965_0e7c6a4aaae6.slice/crio-15b0ee63a8404dad7e624111bc5f2aec382b9e345c36a329499906c38847a203 WatchSource:0}: Error finding container 15b0ee63a8404dad7e624111bc5f2aec382b9e345c36a329499906c38847a203: Status 404 returned error can't find the container with id 15b0ee63a8404dad7e624111bc5f2aec382b9e345c36a329499906c38847a203
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.474029 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.493811 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.515745 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.540677 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.554140 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.576177 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.594702 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.614724 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.625395 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7b416e1d-da7d-4da7-9bae-210c815d4cf1-images\") pod \"machine-api-operator-5694c8668f-fz52t\" (UID: \"7b416e1d-da7d-4da7-9bae-210c815d4cf1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.625481 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7b416e1d-da7d-4da7-9bae-210c815d4cf1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fz52t\" (UID: \"7b416e1d-da7d-4da7-9bae-210c815d4cf1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.625501 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b416e1d-da7d-4da7-9bae-210c815d4cf1-config\") pod \"machine-api-operator-5694c8668f-fz52t\" (UID: \"7b416e1d-da7d-4da7-9bae-210c815d4cf1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.625560 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b54a115-ed61-47a7-b447-400aa4f75b1b-config\") pod \"machine-approver-56656f9798-c7s9c\" (UID: \"4b54a115-ed61-47a7-b447-400aa4f75b1b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.635856 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.636056 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb"]
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.648315 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk"]
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.654609 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.683240 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.695250 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.715016 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.734288 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.754814 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.774106 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.795013 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.815159 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.834483 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.855309 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.874476 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.893626 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.914760 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.933959 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Nov 25 15:37:34 crc kubenswrapper[4704]: E1125 15:37:34.950013 4704 projected.go:288] Couldn't get configMap openshift-machine-api/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.954457 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.974352 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Nov 25 15:37:34 crc kubenswrapper[4704]: I1125 15:37:34.995068 4704 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.013973 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.034965 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.053925 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.075952 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125
15:37:35.094021 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.105900 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7pmpw" event={"ID":"6e167ba8-a633-42df-963a-913ba4fe20bf","Type":"ContainerStarted","Data":"2e711ca0869e046690c866ae7da933a964c71a5e1199a26f27f067451ec53e1b"} Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.107503 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" event={"ID":"4ce56dcb-a916-41ca-b706-df5e157576eb","Type":"ContainerStarted","Data":"279430a122b5f1dac181ab543d7959f7e06e979b1dc10ce71558bd5a618ce27e"} Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.108686 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gbzgh" event={"ID":"fe8e9530-3977-4dc5-abe0-f8c655b58f6a","Type":"ContainerStarted","Data":"ee2babbdde2ee9e479bdd8a20265056d95d78ccf20e7c930ea09ee2c2890e12d"} Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.109820 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" event={"ID":"d60f2c5d-3a97-47a6-a311-dffb74233746","Type":"ContainerStarted","Data":"ba061b74330a28b102ed87f04ce3c540e202ecd469387ab901ac9252793ffc44"} Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.111291 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" event={"ID":"17320021-32dc-4bef-befa-fa0a7c2b8533","Type":"ContainerStarted","Data":"2c0a2e3c359b50e830c05cbdeb911b9ea3ca746287e3507b7a140088d0db535a"} Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.112549 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" event={"ID":"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6","Type":"ContainerStarted","Data":"15b0ee63a8404dad7e624111bc5f2aec382b9e345c36a329499906c38847a203"} Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.113634 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p" event={"ID":"23e9d749-57d6-4ce1-a899-44b745738978","Type":"ContainerStarted","Data":"375d99707eea4a5e6c8e1f5a5b3b5fe68d6305a6dab574facfe581a459b07333"} Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.114156 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.114507 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" event={"ID":"42554b00-c5ca-41d5-b84e-af36e56239c6","Type":"ContainerStarted","Data":"41f303bac056e0931a0b87279525e06bc99c19503f4b16410c9ac8dfb9d3481f"} Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.115901 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z692d" event={"ID":"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7","Type":"ContainerStarted","Data":"e5f27c49be2e63772d4797e72926c5f5ba7f324d6159d02ba3cd131a04b15263"} Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.132235 4704 request.go:700] Waited for 1.933608091s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0 Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.134736 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 25 15:37:35 crc 
kubenswrapper[4704]: I1125 15:37:35.155010 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.174704 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.211984 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t94w6\" (UniqueName: \"kubernetes.io/projected/2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9-kube-api-access-t94w6\") pod \"olm-operator-6b444d44fb-d8lwt\" (UID: \"2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.234765 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvt2w\" (UniqueName: \"kubernetes.io/projected/dd45c23d-4eaf-40d9-a735-eb804c875a59-kube-api-access-pvt2w\") pod \"ingress-operator-5b745b69d9-gff7t\" (UID: \"dd45c23d-4eaf-40d9-a735-eb804c875a59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.251305 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wklfx\" (UniqueName: \"kubernetes.io/projected/0ed8b5be-28a8-4dcf-ad65-16d392570684-kube-api-access-wklfx\") pod \"catalog-operator-68c6474976-tbhnz\" (UID: \"0ed8b5be-28a8-4dcf-ad65-16d392570684\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.270258 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsm5f\" (UniqueName: \"kubernetes.io/projected/fff4a5ac-b41a-4c64-b448-5a687e16e9cd-kube-api-access-nsm5f\") pod \"dns-operator-744455d44c-qxl8w\" (UID: \"fff4a5ac-b41a-4c64-b448-5a687e16e9cd\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-qxl8w" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.276447 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qxl8w" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.297652 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cgjh\" (UniqueName: \"kubernetes.io/projected/5250fdec-a063-483a-9bd2-b4e11479c232-kube-api-access-4cgjh\") pod \"console-operator-58897d9998-nd74n\" (UID: \"5250fdec-a063-483a-9bd2-b4e11479c232\") " pod="openshift-console-operator/console-operator-58897d9998-nd74n" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.319200 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dph8\" (UniqueName: \"kubernetes.io/projected/ebac8a19-3d09-41ed-99c0-061a54e7e6ec-kube-api-access-7dph8\") pod \"etcd-operator-b45778765-p5r8t\" (UID: \"ebac8a19-3d09-41ed-99c0-061a54e7e6ec\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.326556 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.345062 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd45c23d-4eaf-40d9-a735-eb804c875a59-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gff7t\" (UID: \"dd45c23d-4eaf-40d9-a735-eb804c875a59\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.348043 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.376132 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.394184 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 25 15:37:35 crc kubenswrapper[4704]: E1125 15:37:35.401229 4704 projected.go:194] Error preparing data for projected volume kube-api-access-txnfx for pod openshift-machine-api/machine-api-operator-5694c8668f-fz52t: failed to sync configmap cache: timed out waiting for the condition Nov 25 15:37:35 crc kubenswrapper[4704]: E1125 15:37:35.401605 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7b416e1d-da7d-4da7-9bae-210c815d4cf1-kube-api-access-txnfx podName:7b416e1d-da7d-4da7-9bae-210c815d4cf1 nodeName:}" failed. No retries permitted until 2025-11-25 15:37:35.901564692 +0000 UTC m=+142.169838483 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-txnfx" (UniqueName: "kubernetes.io/projected/7b416e1d-da7d-4da7-9bae-210c815d4cf1-kube-api-access-txnfx") pod "machine-api-operator-5694c8668f-fz52t" (UID: "7b416e1d-da7d-4da7-9bae-210c815d4cf1") : failed to sync configmap cache: timed out waiting for the condition Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.416328 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.436012 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.438068 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b54a115-ed61-47a7-b447-400aa4f75b1b-config\") pod \"machine-approver-56656f9798-c7s9c\" (UID: \"4b54a115-ed61-47a7-b447-400aa4f75b1b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.442043 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.442110 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a37f016b-da0c-4046-9dbe-c4ce4eb4fcfc-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mm4pt\" (UID: \"a37f016b-da0c-4046-9dbe-c4ce4eb4fcfc\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.442140 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-registry-certificates\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.442205 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.442924 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppgct\" (UniqueName: \"kubernetes.io/projected/5158c6c8-eef4-48f4-82c7-6db821d6894e-kube-api-access-ppgct\") pod \"openshift-controller-manager-operator-756b6f6bc6-wj7zx\" (UID: \"5158c6c8-eef4-48f4-82c7-6db821d6894e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.443021 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-bound-sa-token\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.443133 4704 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.443183 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gmqq\" (UniqueName: \"kubernetes.io/projected/01031250-19c0-4447-890a-ead82e3257ff-kube-api-access-7gmqq\") pod \"packageserver-d55dfcdfc-xqbjw\" (UID: \"01031250-19c0-4447-890a-ead82e3257ff\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.443329 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-trusted-ca\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.443369 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/01031250-19c0-4447-890a-ead82e3257ff-tmpfs\") pod \"packageserver-d55dfcdfc-xqbjw\" (UID: \"01031250-19c0-4447-890a-ead82e3257ff\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.443433 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-registry-tls\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: 
\"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.443487 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5158c6c8-eef4-48f4-82c7-6db821d6894e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wj7zx\" (UID: \"5158c6c8-eef4-48f4-82c7-6db821d6894e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.443551 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01031250-19c0-4447-890a-ead82e3257ff-apiservice-cert\") pod \"packageserver-d55dfcdfc-xqbjw\" (UID: \"01031250-19c0-4447-890a-ead82e3257ff\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.443566 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01031250-19c0-4447-890a-ead82e3257ff-webhook-cert\") pod \"packageserver-d55dfcdfc-xqbjw\" (UID: \"01031250-19c0-4447-890a-ead82e3257ff\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.443594 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2rbw\" (UniqueName: \"kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-kube-api-access-s2rbw\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.443619 4704 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwgm7\" (UniqueName: \"kubernetes.io/projected/a37f016b-da0c-4046-9dbe-c4ce4eb4fcfc-kube-api-access-vwgm7\") pod \"package-server-manager-789f6589d5-mm4pt\" (UID: \"a37f016b-da0c-4046-9dbe-c4ce4eb4fcfc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.443660 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5158c6c8-eef4-48f4-82c7-6db821d6894e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wj7zx\" (UID: \"5158c6c8-eef4-48f4-82c7-6db821d6894e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx" Nov 25 15:37:35 crc kubenswrapper[4704]: E1125 15:37:35.443985 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:35.943934479 +0000 UTC m=+142.212208530 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.455064 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.457096 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7b416e1d-da7d-4da7-9bae-210c815d4cf1-images\") pod \"machine-api-operator-5694c8668f-fz52t\" (UID: \"7b416e1d-da7d-4da7-9bae-210c815d4cf1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.481199 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.487301 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b416e1d-da7d-4da7-9bae-210c815d4cf1-config\") pod \"machine-api-operator-5694c8668f-fz52t\" (UID: \"7b416e1d-da7d-4da7-9bae-210c815d4cf1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.496315 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.503899 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/7b416e1d-da7d-4da7-9bae-210c815d4cf1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fz52t\" (UID: \"7b416e1d-da7d-4da7-9bae-210c815d4cf1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.561041 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nd74n" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.561276 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.561680 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea3a5b0d-7617-4ff6-8170-b6d692474a2a-signing-key\") pod \"service-ca-9c57cc56f-7h6hx\" (UID: \"ea3a5b0d-7617-4ff6-8170-b6d692474a2a\") " pod="openshift-service-ca/service-ca-9c57cc56f-7h6hx" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.562198 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea3a5b0d-7617-4ff6-8170-b6d692474a2a-signing-cabundle\") pod \"service-ca-9c57cc56f-7h6hx\" (UID: \"ea3a5b0d-7617-4ff6-8170-b6d692474a2a\") " pod="openshift-service-ca/service-ca-9c57cc56f-7h6hx" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.562238 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/35b36add-59c3-4cb4-940e-76535d4d7479-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-jrgmq\" (UID: \"35b36add-59c3-4cb4-940e-76535d4d7479\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrgmq" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.562285 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/66e027ec-5c2f-4314-8572-052d7202f17c-registration-dir\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.562315 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7f97ced7-4114-4d79-be3a-8f419ae80727-images\") pod \"machine-config-operator-74547568cd-8df8d\" (UID: \"7f97ced7-4114-4d79-be3a-8f419ae80727\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.562371 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.562614 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" Nov 25 15:37:35 crc kubenswrapper[4704]: E1125 15:37:35.562632 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 15:37:36.062607995 +0000 UTC m=+142.330881786 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.563303 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppgct\" (UniqueName: \"kubernetes.io/projected/5158c6c8-eef4-48f4-82c7-6db821d6894e-kube-api-access-ppgct\") pod \"openshift-controller-manager-operator-756b6f6bc6-wj7zx\" (UID: \"5158c6c8-eef4-48f4-82c7-6db821d6894e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.563372 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr885\" (UniqueName: \"kubernetes.io/projected/b8100f5c-f741-4ef6-b462-d2ce26957517-kube-api-access-vr885\") pod \"control-plane-machine-set-operator-78cbb6b69f-kklq9\" (UID: \"b8100f5c-f741-4ef6-b462-d2ce26957517\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kklq9" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.563428 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-bound-sa-token\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.563493 
4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/66e027ec-5c2f-4314-8572-052d7202f17c-plugins-dir\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.563540 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a06eb3cb-5f1a-4daa-aa00-abb271c35ba1-config\") pod \"kube-apiserver-operator-766d6c64bb-9l555\" (UID: \"a06eb3cb-5f1a-4daa-aa00-abb271c35ba1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.563602 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.563650 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/66e027ec-5c2f-4314-8572-052d7202f17c-mountpoint-dir\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.563695 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-ca-trust-extracted\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.563782 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gmqq\" (UniqueName: \"kubernetes.io/projected/01031250-19c0-4447-890a-ead82e3257ff-kube-api-access-7gmqq\") pod \"packageserver-d55dfcdfc-xqbjw\" (UID: \"01031250-19c0-4447-890a-ead82e3257ff\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.563841 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0505a32a-4fec-40b4-a7ca-e4057a223101-certs\") pod \"machine-config-server-rxdvg\" (UID: \"0505a32a-4fec-40b4-a7ca-e4057a223101\") " pod="openshift-machine-config-operator/machine-config-server-rxdvg"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.563885 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20d0ae75-5631-4169-98bc-92333964e503-serving-cert\") pod \"service-ca-operator-777779d784-4h6ww\" (UID: \"20d0ae75-5631-4169-98bc-92333964e503\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.563930 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f97ced7-4114-4d79-be3a-8f419ae80727-proxy-tls\") pod \"machine-config-operator-74547568cd-8df8d\" (UID: \"7f97ced7-4114-4d79-be3a-8f419ae80727\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.563975 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c7e3eb0-06b1-4391-9685-713da13f5bd1-service-ca-bundle\") pod \"router-default-5444994796-9whr7\" (UID: \"2c7e3eb0-06b1-4391-9685-713da13f5bd1\") " pod="openshift-ingress/router-default-5444994796-9whr7"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.564040 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe30d0d8-1a03-4eed-b96c-d010f181d09e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xw9w7\" (UID: \"fe30d0d8-1a03-4eed-b96c-d010f181d09e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.564144 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5158c6c8-eef4-48f4-82c7-6db821d6894e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wj7zx\" (UID: \"5158c6c8-eef4-48f4-82c7-6db821d6894e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.564193 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01031250-19c0-4447-890a-ead82e3257ff-apiservice-cert\") pod \"packageserver-d55dfcdfc-xqbjw\" (UID: \"01031250-19c0-4447-890a-ead82e3257ff\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.564238 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01031250-19c0-4447-890a-ead82e3257ff-webhook-cert\") pod \"packageserver-d55dfcdfc-xqbjw\" (UID: \"01031250-19c0-4447-890a-ead82e3257ff\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw"
Nov 25 15:37:35 crc kubenswrapper[4704]: E1125 15:37:35.564312 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:36.064298856 +0000 UTC m=+142.332572637 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.564457 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2rbw\" (UniqueName: \"kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-kube-api-access-s2rbw\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.564579 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwgm7\" (UniqueName: \"kubernetes.io/projected/a37f016b-da0c-4046-9dbe-c4ce4eb4fcfc-kube-api-access-vwgm7\") pod \"package-server-manager-789f6589d5-mm4pt\" (UID: \"a37f016b-da0c-4046-9dbe-c4ce4eb4fcfc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.564621 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034a80a3-c744-464f-a544-2f4f87ad98ed-config\") pod \"kube-controller-manager-operator-78b949d7b-r24mh\" (UID: \"034a80a3-c744-464f-a544-2f4f87ad98ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.564661 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0841ffab-5cab-4009-8b15-bbab0863a3be-metrics-tls\") pod \"dns-default-mc6bz\" (UID: \"0841ffab-5cab-4009-8b15-bbab0863a3be\") " pod="openshift-dns/dns-default-mc6bz"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.564873 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clsqc\" (UniqueName: \"kubernetes.io/projected/2790e4ce-dfef-45f3-bfe4-fd6c7d63d948-kube-api-access-clsqc\") pod \"kube-storage-version-migrator-operator-b67b599dd-vtk75\" (UID: \"2790e4ce-dfef-45f3-bfe4-fd6c7d63d948\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.565040 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/66e027ec-5c2f-4314-8572-052d7202f17c-socket-dir\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.565520 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngh66\" (UniqueName: \"kubernetes.io/projected/7f97ced7-4114-4d79-be3a-8f419ae80727-kube-api-access-ngh66\") pod \"machine-config-operator-74547568cd-8df8d\" (UID: \"7f97ced7-4114-4d79-be3a-8f419ae80727\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.565595 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b8100f5c-f741-4ef6-b462-d2ce26957517-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kklq9\" (UID: \"b8100f5c-f741-4ef6-b462-d2ce26957517\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kklq9"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.567639 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2j8f\" (UniqueName: \"kubernetes.io/projected/0841ffab-5cab-4009-8b15-bbab0863a3be-kube-api-access-z2j8f\") pod \"dns-default-mc6bz\" (UID: \"0841ffab-5cab-4009-8b15-bbab0863a3be\") " pod="openshift-dns/dns-default-mc6bz"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.567691 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl65f\" (UniqueName: \"kubernetes.io/projected/6b7507cc-285f-4fa5-b478-ad6a31a6855c-kube-api-access-kl65f\") pod \"ingress-canary-qxcj2\" (UID: \"6b7507cc-285f-4fa5-b478-ad6a31a6855c\") " pod="openshift-ingress-canary/ingress-canary-qxcj2"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.567748 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t992b\" (UniqueName: \"kubernetes.io/projected/38b9ead9-033f-44cb-9657-6a078bed2c0d-kube-api-access-t992b\") pod \"collect-profiles-29401410-8g42t\" (UID: \"38b9ead9-033f-44cb-9657-6a078bed2c0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.567812 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName:
\"kubernetes.io/secret/6193bcc6-1da4-414c-84df-92b1bead0762-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8hvj9\" (UID: \"6193bcc6-1da4-414c-84df-92b1bead0762\") " pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.567845 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b7507cc-285f-4fa5-b478-ad6a31a6855c-cert\") pod \"ingress-canary-qxcj2\" (UID: \"6b7507cc-285f-4fa5-b478-ad6a31a6855c\") " pod="openshift-ingress-canary/ingress-canary-qxcj2"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.567877 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20d0ae75-5631-4169-98bc-92333964e503-config\") pod \"service-ca-operator-777779d784-4h6ww\" (UID: \"20d0ae75-5631-4169-98bc-92333964e503\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.567935 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcbcq\" (UniqueName: \"kubernetes.io/projected/35b36add-59c3-4cb4-940e-76535d4d7479-kube-api-access-qcbcq\") pod \"multus-admission-controller-857f4d67dd-jrgmq\" (UID: \"35b36add-59c3-4cb4-940e-76535d4d7479\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrgmq"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.567964 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vd7w\" (UniqueName: \"kubernetes.io/projected/20d0ae75-5631-4169-98bc-92333964e503-kube-api-access-4vd7w\") pod \"service-ca-operator-777779d784-4h6ww\" (UID: \"20d0ae75-5631-4169-98bc-92333964e503\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.567997 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c7e3eb0-06b1-4391-9685-713da13f5bd1-metrics-certs\") pod \"router-default-5444994796-9whr7\" (UID: \"2c7e3eb0-06b1-4391-9685-713da13f5bd1\") " pod="openshift-ingress/router-default-5444994796-9whr7"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.571389 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01031250-19c0-4447-890a-ead82e3257ff-apiservice-cert\") pod \"packageserver-d55dfcdfc-xqbjw\" (UID: \"01031250-19c0-4447-890a-ead82e3257ff\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.571486 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01031250-19c0-4447-890a-ead82e3257ff-webhook-cert\") pod \"packageserver-d55dfcdfc-xqbjw\" (UID: \"01031250-19c0-4447-890a-ead82e3257ff\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.575507 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5158c6c8-eef4-48f4-82c7-6db821d6894e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wj7zx\" (UID: \"5158c6c8-eef4-48f4-82c7-6db821d6894e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.577099 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034a80a3-c744-464f-a544-2f4f87ad98ed-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r24mh\" (UID: \"034a80a3-c744-464f-a544-2f4f87ad98ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.577188 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt69p\" (UniqueName: \"kubernetes.io/projected/0505a32a-4fec-40b4-a7ca-e4057a223101-kube-api-access-lt69p\") pod \"machine-config-server-rxdvg\" (UID: \"0505a32a-4fec-40b4-a7ca-e4057a223101\") " pod="openshift-machine-config-operator/machine-config-server-rxdvg"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.577239 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a06eb3cb-5f1a-4daa-aa00-abb271c35ba1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9l555\" (UID: \"a06eb3cb-5f1a-4daa-aa00-abb271c35ba1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.577270 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c7e3eb0-06b1-4391-9685-713da13f5bd1-stats-auth\") pod \"router-default-5444994796-9whr7\" (UID: \"2c7e3eb0-06b1-4391-9685-713da13f5bd1\") " pod="openshift-ingress/router-default-5444994796-9whr7"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.577375 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2790e4ce-dfef-45f3-bfe4-fd6c7d63d948-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vtk75\" (UID: \"2790e4ce-dfef-45f3-bfe4-fd6c7d63d948\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125
15:37:35.577420 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clvjc\" (UniqueName: \"kubernetes.io/projected/62ace4e6-38ce-414a-a549-68671f040e2d-kube-api-access-clvjc\") pod \"machine-config-controller-84d6567774-tr85d\" (UID: \"62ace4e6-38ce-414a-a549-68671f040e2d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.577473 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38b9ead9-033f-44cb-9657-6a078bed2c0d-secret-volume\") pod \"collect-profiles-29401410-8g42t\" (UID: \"38b9ead9-033f-44cb-9657-6a078bed2c0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.577511 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-trusted-ca\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.577538 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/01031250-19c0-4447-890a-ead82e3257ff-tmpfs\") pod \"packageserver-d55dfcdfc-xqbjw\" (UID: \"01031250-19c0-4447-890a-ead82e3257ff\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.577561 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfmss\" (UniqueName: \"kubernetes.io/projected/6193bcc6-1da4-414c-84df-92b1bead0762-kube-api-access-hfmss\") pod \"marketplace-operator-79b997595-8hvj9\" (UID: \"6193bcc6-1da4-414c-84df-92b1bead0762\") " pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.580461 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/01031250-19c0-4447-890a-ead82e3257ff-tmpfs\") pod \"packageserver-d55dfcdfc-xqbjw\" (UID: \"01031250-19c0-4447-890a-ead82e3257ff\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.580845 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c7e3eb0-06b1-4391-9685-713da13f5bd1-default-certificate\") pod \"router-default-5444994796-9whr7\" (UID: \"2c7e3eb0-06b1-4391-9685-713da13f5bd1\") " pod="openshift-ingress/router-default-5444994796-9whr7"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.581005 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/034a80a3-c744-464f-a544-2f4f87ad98ed-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r24mh\" (UID: \"034a80a3-c744-464f-a544-2f4f87ad98ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.581056 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0841ffab-5cab-4009-8b15-bbab0863a3be-config-volume\") pod \"dns-default-mc6bz\" (UID: \"0841ffab-5cab-4009-8b15-bbab0863a3be\") " pod="openshift-dns/dns-default-mc6bz"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.581200 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-registry-tls\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.581242 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe30d0d8-1a03-4eed-b96c-d010f181d09e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xw9w7\" (UID: \"fe30d0d8-1a03-4eed-b96c-d010f181d09e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.581673 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-trusted-ca\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.581712 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/66e027ec-5c2f-4314-8572-052d7202f17c-csi-data-dir\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.581962 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2790e4ce-dfef-45f3-bfe4-fd6c7d63d948-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vtk75\" (UID: \"2790e4ce-dfef-45f3-bfe4-fd6c7d63d948\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75"
Nov 25 15:37:35 crc
kubenswrapper[4704]: I1125 15:37:35.582174 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bdc5\" (UniqueName: \"kubernetes.io/projected/66e027ec-5c2f-4314-8572-052d7202f17c-kube-api-access-2bdc5\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.582238 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f97ced7-4114-4d79-be3a-8f419ae80727-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8df8d\" (UID: \"7f97ced7-4114-4d79-be3a-8f419ae80727\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.582444 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a06eb3cb-5f1a-4daa-aa00-abb271c35ba1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9l555\" (UID: \"a06eb3cb-5f1a-4daa-aa00-abb271c35ba1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.582489 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe30d0d8-1a03-4eed-b96c-d010f181d09e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xw9w7\" (UID: \"fe30d0d8-1a03-4eed-b96c-d010f181d09e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.582611 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95xhk\" (UniqueName: \"kubernetes.io/projected/3b78cd0f-f99d-4774-bc16-002fd09387ed-kube-api-access-95xhk\") pod \"migrator-59844c95c7-vsp9t\" (UID: \"3b78cd0f-f99d-4774-bc16-002fd09387ed\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsp9t"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.582963 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/62ace4e6-38ce-414a-a549-68671f040e2d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tr85d\" (UID: \"62ace4e6-38ce-414a-a549-68671f040e2d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.583153 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5158c6c8-eef4-48f4-82c7-6db821d6894e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wj7zx\" (UID: \"5158c6c8-eef4-48f4-82c7-6db821d6894e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.583211 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38b9ead9-033f-44cb-9657-6a078bed2c0d-config-volume\") pod \"collect-profiles-29401410-8g42t\" (UID: \"38b9ead9-033f-44cb-9657-6a078bed2c0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.583339 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6193bcc6-1da4-414c-84df-92b1bead0762-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8hvj9\" (UID: \"6193bcc6-1da4-414c-84df-92b1bead0762\") " pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.584495 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5158c6c8-eef4-48f4-82c7-6db821d6894e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wj7zx\" (UID: \"5158c6c8-eef4-48f4-82c7-6db821d6894e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.588595 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qxl8w"]
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.588754 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0505a32a-4fec-40b4-a7ca-e4057a223101-node-bootstrap-token\") pod \"machine-config-server-rxdvg\" (UID: \"0505a32a-4fec-40b4-a7ca-e4057a223101\") " pod="openshift-machine-config-operator/machine-config-server-rxdvg"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.588854 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nw2b\" (UniqueName: \"kubernetes.io/projected/2c7e3eb0-06b1-4391-9685-713da13f5bd1-kube-api-access-8nw2b\") pod \"router-default-5444994796-9whr7\" (UID: \"2c7e3eb0-06b1-4391-9685-713da13f5bd1\") " pod="openshift-ingress/router-default-5444994796-9whr7"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.588896 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf"
Nov 25
15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.588930 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a37f016b-da0c-4046-9dbe-c4ce4eb4fcfc-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mm4pt\" (UID: \"a37f016b-da0c-4046-9dbe-c4ce4eb4fcfc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.588964 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/62ace4e6-38ce-414a-a549-68671f040e2d-proxy-tls\") pod \"machine-config-controller-84d6567774-tr85d\" (UID: \"62ace4e6-38ce-414a-a549-68671f040e2d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.589003 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-registry-certificates\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.589033 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv7ph\" (UniqueName: \"kubernetes.io/projected/ea3a5b0d-7617-4ff6-8170-b6d692474a2a-kube-api-access-jv7ph\") pod \"service-ca-9c57cc56f-7h6hx\" (UID: \"ea3a5b0d-7617-4ff6-8170-b6d692474a2a\") " pod="openshift-service-ca/service-ca-9c57cc56f-7h6hx"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.591406 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-registry-tls\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.593848 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-installation-pull-secrets\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.596747 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-registry-certificates\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.597861 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a37f016b-da0c-4046-9dbe-c4ce4eb4fcfc-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mm4pt\" (UID: \"a37f016b-da0c-4046-9dbe-c4ce4eb4fcfc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.603213 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-bound-sa-token\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf"
Nov 25 15:37:35 crc kubenswrapper[4704]: W1125 15:37:35.609232 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfff4a5ac_b41a_4c64_b448_5a687e16e9cd.slice/crio-6de3d44ed5f6808e52aa1065879b5104d4888321aad6f8942fc72d38036d44f1 WatchSource:0}: Error finding container 6de3d44ed5f6808e52aa1065879b5104d4888321aad6f8942fc72d38036d44f1: Status 404 returned error can't find the container with id 6de3d44ed5f6808e52aa1065879b5104d4888321aad6f8942fc72d38036d44f1
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.610216 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.618761 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.624643 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz"]
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.636330 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwgm7\" (UniqueName: \"kubernetes.io/projected/a37f016b-da0c-4046-9dbe-c4ce4eb4fcfc-kube-api-access-vwgm7\") pod \"package-server-manager-789f6589d5-mm4pt\" (UID: \"a37f016b-da0c-4046-9dbe-c4ce4eb4fcfc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.640516 4704 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.646746 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2rbw\" (UniqueName: \"kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-kube-api-access-s2rbw\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf"
Nov 25 15:37:35 crc kubenswrapper[4704]: W1125 15:37:35.651196 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ed8b5be_28a8_4dcf_ad65_16d392570684.slice/crio-787b0b3c9113ba1cdb7f4656b3f3b16b872c645af93e4100b282235fecfd74bf WatchSource:0}: Error finding container 787b0b3c9113ba1cdb7f4656b3f3b16b872c645af93e4100b282235fecfd74bf: Status 404 returned error can't find the container with id 787b0b3c9113ba1cdb7f4656b3f3b16b872c645af93e4100b282235fecfd74bf
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.656770 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gmqq\" (UniqueName: \"kubernetes.io/projected/01031250-19c0-4447-890a-ead82e3257ff-kube-api-access-7gmqq\") pod \"packageserver-d55dfcdfc-xqbjw\" (UID: \"01031250-19c0-4447-890a-ead82e3257ff\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.671667 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt"]
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.682691 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppgct\" (UniqueName: \"kubernetes.io/projected/5158c6c8-eef4-48f4-82c7-6db821d6894e-kube-api-access-ppgct\") pod \"openshift-controller-manager-operator-756b6f6bc6-wj7zx\" (UID: \"5158c6c8-eef4-48f4-82c7-6db821d6894e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.689760 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690003 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr885\" (UniqueName: \"kubernetes.io/projected/b8100f5c-f741-4ef6-b462-d2ce26957517-kube-api-access-vr885\") pod \"control-plane-machine-set-operator-78cbb6b69f-kklq9\" (UID: \"b8100f5c-f741-4ef6-b462-d2ce26957517\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kklq9"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690031 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/66e027ec-5c2f-4314-8572-052d7202f17c-plugins-dir\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690048 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a06eb3cb-5f1a-4daa-aa00-abb271c35ba1-config\") pod \"kube-apiserver-operator-766d6c64bb-9l555\" (UID: \"a06eb3cb-5f1a-4daa-aa00-abb271c35ba1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690079 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/66e027ec-5c2f-4314-8572-052d7202f17c-mountpoint-dir\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690095 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0505a32a-4fec-40b4-a7ca-e4057a223101-certs\") pod \"machine-config-server-rxdvg\" (UID: \"0505a32a-4fec-40b4-a7ca-e4057a223101\") " pod="openshift-machine-config-operator/machine-config-server-rxdvg"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690113 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20d0ae75-5631-4169-98bc-92333964e503-serving-cert\") pod \"service-ca-operator-777779d784-4h6ww\" (UID: \"20d0ae75-5631-4169-98bc-92333964e503\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690468 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f97ced7-4114-4d79-be3a-8f419ae80727-proxy-tls\") pod \"machine-config-operator-74547568cd-8df8d\" (UID: \"7f97ced7-4114-4d79-be3a-8f419ae80727\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690488 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c7e3eb0-06b1-4391-9685-713da13f5bd1-service-ca-bundle\") pod \"router-default-5444994796-9whr7\" (UID: \"2c7e3eb0-06b1-4391-9685-713da13f5bd1\") " pod="openshift-ingress/router-default-5444994796-9whr7"
Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690512 4704
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe30d0d8-1a03-4eed-b96c-d010f181d09e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xw9w7\" (UID: \"fe30d0d8-1a03-4eed-b96c-d010f181d09e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690541 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034a80a3-c744-464f-a544-2f4f87ad98ed-config\") pod \"kube-controller-manager-operator-78b949d7b-r24mh\" (UID: \"034a80a3-c744-464f-a544-2f4f87ad98ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690561 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0841ffab-5cab-4009-8b15-bbab0863a3be-metrics-tls\") pod \"dns-default-mc6bz\" (UID: \"0841ffab-5cab-4009-8b15-bbab0863a3be\") " pod="openshift-dns/dns-default-mc6bz" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690585 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clsqc\" (UniqueName: \"kubernetes.io/projected/2790e4ce-dfef-45f3-bfe4-fd6c7d63d948-kube-api-access-clsqc\") pod \"kube-storage-version-migrator-operator-b67b599dd-vtk75\" (UID: \"2790e4ce-dfef-45f3-bfe4-fd6c7d63d948\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690604 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngh66\" (UniqueName: \"kubernetes.io/projected/7f97ced7-4114-4d79-be3a-8f419ae80727-kube-api-access-ngh66\") pod 
\"machine-config-operator-74547568cd-8df8d\" (UID: \"7f97ced7-4114-4d79-be3a-8f419ae80727\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690620 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b8100f5c-f741-4ef6-b462-d2ce26957517-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kklq9\" (UID: \"b8100f5c-f741-4ef6-b462-d2ce26957517\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kklq9" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690638 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/66e027ec-5c2f-4314-8572-052d7202f17c-socket-dir\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690661 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2j8f\" (UniqueName: \"kubernetes.io/projected/0841ffab-5cab-4009-8b15-bbab0863a3be-kube-api-access-z2j8f\") pod \"dns-default-mc6bz\" (UID: \"0841ffab-5cab-4009-8b15-bbab0863a3be\") " pod="openshift-dns/dns-default-mc6bz" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690676 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl65f\" (UniqueName: \"kubernetes.io/projected/6b7507cc-285f-4fa5-b478-ad6a31a6855c-kube-api-access-kl65f\") pod \"ingress-canary-qxcj2\" (UID: \"6b7507cc-285f-4fa5-b478-ad6a31a6855c\") " pod="openshift-ingress-canary/ingress-canary-qxcj2" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690692 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-t992b\" (UniqueName: \"kubernetes.io/projected/38b9ead9-033f-44cb-9657-6a078bed2c0d-kube-api-access-t992b\") pod \"collect-profiles-29401410-8g42t\" (UID: \"38b9ead9-033f-44cb-9657-6a078bed2c0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690711 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6193bcc6-1da4-414c-84df-92b1bead0762-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8hvj9\" (UID: \"6193bcc6-1da4-414c-84df-92b1bead0762\") " pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690729 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b7507cc-285f-4fa5-b478-ad6a31a6855c-cert\") pod \"ingress-canary-qxcj2\" (UID: \"6b7507cc-285f-4fa5-b478-ad6a31a6855c\") " pod="openshift-ingress-canary/ingress-canary-qxcj2" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690742 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20d0ae75-5631-4169-98bc-92333964e503-config\") pod \"service-ca-operator-777779d784-4h6ww\" (UID: \"20d0ae75-5631-4169-98bc-92333964e503\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690776 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcbcq\" (UniqueName: \"kubernetes.io/projected/35b36add-59c3-4cb4-940e-76535d4d7479-kube-api-access-qcbcq\") pod \"multus-admission-controller-857f4d67dd-jrgmq\" (UID: \"35b36add-59c3-4cb4-940e-76535d4d7479\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrgmq" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 
15:37:35.690823 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vd7w\" (UniqueName: \"kubernetes.io/projected/20d0ae75-5631-4169-98bc-92333964e503-kube-api-access-4vd7w\") pod \"service-ca-operator-777779d784-4h6ww\" (UID: \"20d0ae75-5631-4169-98bc-92333964e503\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690843 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c7e3eb0-06b1-4391-9685-713da13f5bd1-metrics-certs\") pod \"router-default-5444994796-9whr7\" (UID: \"2c7e3eb0-06b1-4391-9685-713da13f5bd1\") " pod="openshift-ingress/router-default-5444994796-9whr7" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690863 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034a80a3-c744-464f-a544-2f4f87ad98ed-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r24mh\" (UID: \"034a80a3-c744-464f-a544-2f4f87ad98ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690883 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt69p\" (UniqueName: \"kubernetes.io/projected/0505a32a-4fec-40b4-a7ca-e4057a223101-kube-api-access-lt69p\") pod \"machine-config-server-rxdvg\" (UID: \"0505a32a-4fec-40b4-a7ca-e4057a223101\") " pod="openshift-machine-config-operator/machine-config-server-rxdvg" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690898 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a06eb3cb-5f1a-4daa-aa00-abb271c35ba1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9l555\" (UID: 
\"a06eb3cb-5f1a-4daa-aa00-abb271c35ba1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690917 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c7e3eb0-06b1-4391-9685-713da13f5bd1-stats-auth\") pod \"router-default-5444994796-9whr7\" (UID: \"2c7e3eb0-06b1-4391-9685-713da13f5bd1\") " pod="openshift-ingress/router-default-5444994796-9whr7" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690948 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2790e4ce-dfef-45f3-bfe4-fd6c7d63d948-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vtk75\" (UID: \"2790e4ce-dfef-45f3-bfe4-fd6c7d63d948\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690965 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clvjc\" (UniqueName: \"kubernetes.io/projected/62ace4e6-38ce-414a-a549-68671f040e2d-kube-api-access-clvjc\") pod \"machine-config-controller-84d6567774-tr85d\" (UID: \"62ace4e6-38ce-414a-a549-68671f040e2d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.690987 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38b9ead9-033f-44cb-9657-6a078bed2c0d-secret-volume\") pod \"collect-profiles-29401410-8g42t\" (UID: \"38b9ead9-033f-44cb-9657-6a078bed2c0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691024 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-hfmss\" (UniqueName: \"kubernetes.io/projected/6193bcc6-1da4-414c-84df-92b1bead0762-kube-api-access-hfmss\") pod \"marketplace-operator-79b997595-8hvj9\" (UID: \"6193bcc6-1da4-414c-84df-92b1bead0762\") " pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691051 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/034a80a3-c744-464f-a544-2f4f87ad98ed-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r24mh\" (UID: \"034a80a3-c744-464f-a544-2f4f87ad98ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691071 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c7e3eb0-06b1-4391-9685-713da13f5bd1-default-certificate\") pod \"router-default-5444994796-9whr7\" (UID: \"2c7e3eb0-06b1-4391-9685-713da13f5bd1\") " pod="openshift-ingress/router-default-5444994796-9whr7" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691093 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0841ffab-5cab-4009-8b15-bbab0863a3be-config-volume\") pod \"dns-default-mc6bz\" (UID: \"0841ffab-5cab-4009-8b15-bbab0863a3be\") " pod="openshift-dns/dns-default-mc6bz" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691132 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe30d0d8-1a03-4eed-b96c-d010f181d09e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xw9w7\" (UID: \"fe30d0d8-1a03-4eed-b96c-d010f181d09e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7" Nov 25 15:37:35 crc 
kubenswrapper[4704]: I1125 15:37:35.691157 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/66e027ec-5c2f-4314-8572-052d7202f17c-csi-data-dir\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691176 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2790e4ce-dfef-45f3-bfe4-fd6c7d63d948-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vtk75\" (UID: \"2790e4ce-dfef-45f3-bfe4-fd6c7d63d948\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691206 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bdc5\" (UniqueName: \"kubernetes.io/projected/66e027ec-5c2f-4314-8572-052d7202f17c-kube-api-access-2bdc5\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691226 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f97ced7-4114-4d79-be3a-8f419ae80727-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8df8d\" (UID: \"7f97ced7-4114-4d79-be3a-8f419ae80727\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691245 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a06eb3cb-5f1a-4daa-aa00-abb271c35ba1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9l555\" (UID: 
\"a06eb3cb-5f1a-4daa-aa00-abb271c35ba1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691267 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe30d0d8-1a03-4eed-b96c-d010f181d09e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xw9w7\" (UID: \"fe30d0d8-1a03-4eed-b96c-d010f181d09e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691294 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95xhk\" (UniqueName: \"kubernetes.io/projected/3b78cd0f-f99d-4774-bc16-002fd09387ed-kube-api-access-95xhk\") pod \"migrator-59844c95c7-vsp9t\" (UID: \"3b78cd0f-f99d-4774-bc16-002fd09387ed\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsp9t" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691318 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/62ace4e6-38ce-414a-a549-68671f040e2d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tr85d\" (UID: \"62ace4e6-38ce-414a-a549-68671f040e2d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691341 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38b9ead9-033f-44cb-9657-6a078bed2c0d-config-volume\") pod \"collect-profiles-29401410-8g42t\" (UID: \"38b9ead9-033f-44cb-9657-6a078bed2c0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691362 4704 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0505a32a-4fec-40b4-a7ca-e4057a223101-node-bootstrap-token\") pod \"machine-config-server-rxdvg\" (UID: \"0505a32a-4fec-40b4-a7ca-e4057a223101\") " pod="openshift-machine-config-operator/machine-config-server-rxdvg" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691384 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6193bcc6-1da4-414c-84df-92b1bead0762-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8hvj9\" (UID: \"6193bcc6-1da4-414c-84df-92b1bead0762\") " pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691408 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nw2b\" (UniqueName: \"kubernetes.io/projected/2c7e3eb0-06b1-4391-9685-713da13f5bd1-kube-api-access-8nw2b\") pod \"router-default-5444994796-9whr7\" (UID: \"2c7e3eb0-06b1-4391-9685-713da13f5bd1\") " pod="openshift-ingress/router-default-5444994796-9whr7" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691441 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/62ace4e6-38ce-414a-a549-68671f040e2d-proxy-tls\") pod \"machine-config-controller-84d6567774-tr85d\" (UID: \"62ace4e6-38ce-414a-a549-68671f040e2d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691465 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv7ph\" (UniqueName: \"kubernetes.io/projected/ea3a5b0d-7617-4ff6-8170-b6d692474a2a-kube-api-access-jv7ph\") pod \"service-ca-9c57cc56f-7h6hx\" (UID: \"ea3a5b0d-7617-4ff6-8170-b6d692474a2a\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-7h6hx" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691486 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea3a5b0d-7617-4ff6-8170-b6d692474a2a-signing-key\") pod \"service-ca-9c57cc56f-7h6hx\" (UID: \"ea3a5b0d-7617-4ff6-8170-b6d692474a2a\") " pod="openshift-service-ca/service-ca-9c57cc56f-7h6hx" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691507 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea3a5b0d-7617-4ff6-8170-b6d692474a2a-signing-cabundle\") pod \"service-ca-9c57cc56f-7h6hx\" (UID: \"ea3a5b0d-7617-4ff6-8170-b6d692474a2a\") " pod="openshift-service-ca/service-ca-9c57cc56f-7h6hx" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691527 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/35b36add-59c3-4cb4-940e-76535d4d7479-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jrgmq\" (UID: \"35b36add-59c3-4cb4-940e-76535d4d7479\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrgmq" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691559 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/66e027ec-5c2f-4314-8572-052d7202f17c-registration-dir\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.691580 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7f97ced7-4114-4d79-be3a-8f419ae80727-images\") pod \"machine-config-operator-74547568cd-8df8d\" (UID: \"7f97ced7-4114-4d79-be3a-8f419ae80727\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d" Nov 25 15:37:35 crc kubenswrapper[4704]: E1125 15:37:35.692996 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:36.192955207 +0000 UTC m=+142.461229008 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.693984 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/66e027ec-5c2f-4314-8572-052d7202f17c-plugins-dir\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.694548 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/66e027ec-5c2f-4314-8572-052d7202f17c-mountpoint-dir\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.696258 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38b9ead9-033f-44cb-9657-6a078bed2c0d-config-volume\") pod \"collect-profiles-29401410-8g42t\" (UID: 
\"38b9ead9-033f-44cb-9657-6a078bed2c0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.696343 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a06eb3cb-5f1a-4daa-aa00-abb271c35ba1-config\") pod \"kube-apiserver-operator-766d6c64bb-9l555\" (UID: \"a06eb3cb-5f1a-4daa-aa00-abb271c35ba1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.696820 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7f97ced7-4114-4d79-be3a-8f419ae80727-images\") pod \"machine-config-operator-74547568cd-8df8d\" (UID: \"7f97ced7-4114-4d79-be3a-8f419ae80727\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.697621 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0841ffab-5cab-4009-8b15-bbab0863a3be-config-volume\") pod \"dns-default-mc6bz\" (UID: \"0841ffab-5cab-4009-8b15-bbab0863a3be\") " pod="openshift-dns/dns-default-mc6bz" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.699719 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/66e027ec-5c2f-4314-8572-052d7202f17c-socket-dir\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.700667 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20d0ae75-5631-4169-98bc-92333964e503-config\") pod \"service-ca-operator-777779d784-4h6ww\" (UID: 
\"20d0ae75-5631-4169-98bc-92333964e503\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.701359 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6193bcc6-1da4-414c-84df-92b1bead0762-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8hvj9\" (UID: \"6193bcc6-1da4-414c-84df-92b1bead0762\") " pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.705371 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2790e4ce-dfef-45f3-bfe4-fd6c7d63d948-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vtk75\" (UID: \"2790e4ce-dfef-45f3-bfe4-fd6c7d63d948\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.706164 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c7e3eb0-06b1-4391-9685-713da13f5bd1-metrics-certs\") pod \"router-default-5444994796-9whr7\" (UID: \"2c7e3eb0-06b1-4391-9685-713da13f5bd1\") " pod="openshift-ingress/router-default-5444994796-9whr7" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.708437 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/66e027ec-5c2f-4314-8572-052d7202f17c-csi-data-dir\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.710436 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fe30d0d8-1a03-4eed-b96c-d010f181d09e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xw9w7\" (UID: \"fe30d0d8-1a03-4eed-b96c-d010f181d09e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.710819 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea3a5b0d-7617-4ff6-8170-b6d692474a2a-signing-cabundle\") pod \"service-ca-9c57cc56f-7h6hx\" (UID: \"ea3a5b0d-7617-4ff6-8170-b6d692474a2a\") " pod="openshift-service-ca/service-ca-9c57cc56f-7h6hx" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.711548 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/66e027ec-5c2f-4314-8572-052d7202f17c-registration-dir\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.711956 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f97ced7-4114-4d79-be3a-8f419ae80727-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8df8d\" (UID: \"7f97ced7-4114-4d79-be3a-8f419ae80727\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.712463 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c7e3eb0-06b1-4391-9685-713da13f5bd1-default-certificate\") pod \"router-default-5444994796-9whr7\" (UID: \"2c7e3eb0-06b1-4391-9685-713da13f5bd1\") " pod="openshift-ingress/router-default-5444994796-9whr7" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.713914 4704 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c7e3eb0-06b1-4391-9685-713da13f5bd1-stats-auth\") pod \"router-default-5444994796-9whr7\" (UID: \"2c7e3eb0-06b1-4391-9685-713da13f5bd1\") " pod="openshift-ingress/router-default-5444994796-9whr7" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.714286 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea3a5b0d-7617-4ff6-8170-b6d692474a2a-signing-key\") pod \"service-ca-9c57cc56f-7h6hx\" (UID: \"ea3a5b0d-7617-4ff6-8170-b6d692474a2a\") " pod="openshift-service-ca/service-ca-9c57cc56f-7h6hx" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.715420 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c7e3eb0-06b1-4391-9685-713da13f5bd1-service-ca-bundle\") pod \"router-default-5444994796-9whr7\" (UID: \"2c7e3eb0-06b1-4391-9685-713da13f5bd1\") " pod="openshift-ingress/router-default-5444994796-9whr7" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.719405 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe30d0d8-1a03-4eed-b96c-d010f181d09e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xw9w7\" (UID: \"fe30d0d8-1a03-4eed-b96c-d010f181d09e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.719882 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6193bcc6-1da4-414c-84df-92b1bead0762-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8hvj9\" (UID: \"6193bcc6-1da4-414c-84df-92b1bead0762\") " pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 
15:37:35.720139 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0841ffab-5cab-4009-8b15-bbab0863a3be-metrics-tls\") pod \"dns-default-mc6bz\" (UID: \"0841ffab-5cab-4009-8b15-bbab0863a3be\") " pod="openshift-dns/dns-default-mc6bz" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.723089 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a06eb3cb-5f1a-4daa-aa00-abb271c35ba1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-9l555\" (UID: \"a06eb3cb-5f1a-4daa-aa00-abb271c35ba1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.724095 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20d0ae75-5631-4169-98bc-92333964e503-serving-cert\") pod \"service-ca-operator-777779d784-4h6ww\" (UID: \"20d0ae75-5631-4169-98bc-92333964e503\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.724921 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38b9ead9-033f-44cb-9657-6a078bed2c0d-secret-volume\") pod \"collect-profiles-29401410-8g42t\" (UID: \"38b9ead9-033f-44cb-9657-6a078bed2c0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.725613 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f97ced7-4114-4d79-be3a-8f419ae80727-proxy-tls\") pod \"machine-config-operator-74547568cd-8df8d\" (UID: \"7f97ced7-4114-4d79-be3a-8f419ae80727\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d" Nov 25 15:37:35 crc 
kubenswrapper[4704]: I1125 15:37:35.725985 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b8100f5c-f741-4ef6-b462-d2ce26957517-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kklq9\" (UID: \"b8100f5c-f741-4ef6-b462-d2ce26957517\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kklq9" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.726254 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0505a32a-4fec-40b4-a7ca-e4057a223101-certs\") pod \"machine-config-server-rxdvg\" (UID: \"0505a32a-4fec-40b4-a7ca-e4057a223101\") " pod="openshift-machine-config-operator/machine-config-server-rxdvg" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.726345 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/034a80a3-c744-464f-a544-2f4f87ad98ed-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-r24mh\" (UID: \"034a80a3-c744-464f-a544-2f4f87ad98ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.726969 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr885\" (UniqueName: \"kubernetes.io/projected/b8100f5c-f741-4ef6-b462-d2ce26957517-kube-api-access-vr885\") pod \"control-plane-machine-set-operator-78cbb6b69f-kklq9\" (UID: \"b8100f5c-f741-4ef6-b462-d2ce26957517\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kklq9" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.727328 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/0505a32a-4fec-40b4-a7ca-e4057a223101-node-bootstrap-token\") pod \"machine-config-server-rxdvg\" (UID: \"0505a32a-4fec-40b4-a7ca-e4057a223101\") " pod="openshift-machine-config-operator/machine-config-server-rxdvg" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.727423 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2790e4ce-dfef-45f3-bfe4-fd6c7d63d948-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vtk75\" (UID: \"2790e4ce-dfef-45f3-bfe4-fd6c7d63d948\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.727802 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b7507cc-285f-4fa5-b478-ad6a31a6855c-cert\") pod \"ingress-canary-qxcj2\" (UID: \"6b7507cc-285f-4fa5-b478-ad6a31a6855c\") " pod="openshift-ingress-canary/ingress-canary-qxcj2" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.728839 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/62ace4e6-38ce-414a-a549-68671f040e2d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tr85d\" (UID: \"62ace4e6-38ce-414a-a549-68671f040e2d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.729405 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/62ace4e6-38ce-414a-a549-68671f040e2d-proxy-tls\") pod \"machine-config-controller-84d6567774-tr85d\" (UID: \"62ace4e6-38ce-414a-a549-68671f040e2d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.732391 4704 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/35b36add-59c3-4cb4-940e-76535d4d7479-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jrgmq\" (UID: \"35b36add-59c3-4cb4-940e-76535d4d7479\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrgmq" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.754839 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/034a80a3-c744-464f-a544-2f4f87ad98ed-config\") pod \"kube-controller-manager-operator-78b949d7b-r24mh\" (UID: \"034a80a3-c744-464f-a544-2f4f87ad98ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.765928 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95xhk\" (UniqueName: \"kubernetes.io/projected/3b78cd0f-f99d-4774-bc16-002fd09387ed-kube-api-access-95xhk\") pod \"migrator-59844c95c7-vsp9t\" (UID: \"3b78cd0f-f99d-4774-bc16-002fd09387ed\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsp9t" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.780206 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl65f\" (UniqueName: \"kubernetes.io/projected/6b7507cc-285f-4fa5-b478-ad6a31a6855c-kube-api-access-kl65f\") pod \"ingress-canary-qxcj2\" (UID: \"6b7507cc-285f-4fa5-b478-ad6a31a6855c\") " pod="openshift-ingress-canary/ingress-canary-qxcj2" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.792722 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.793358 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vd7w\" (UniqueName: \"kubernetes.io/projected/20d0ae75-5631-4169-98bc-92333964e503-kube-api-access-4vd7w\") pod \"service-ca-operator-777779d784-4h6ww\" (UID: \"20d0ae75-5631-4169-98bc-92333964e503\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww" Nov 25 15:37:35 crc kubenswrapper[4704]: E1125 15:37:35.793841 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:36.293779366 +0000 UTC m=+142.562053357 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.810883 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qxcj2" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.817725 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t992b\" (UniqueName: \"kubernetes.io/projected/38b9ead9-033f-44cb-9657-6a078bed2c0d-kube-api-access-t992b\") pod \"collect-profiles-29401410-8g42t\" (UID: \"38b9ead9-033f-44cb-9657-6a078bed2c0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.853230 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt69p\" (UniqueName: \"kubernetes.io/projected/0505a32a-4fec-40b4-a7ca-e4057a223101-kube-api-access-lt69p\") pod \"machine-config-server-rxdvg\" (UID: \"0505a32a-4fec-40b4-a7ca-e4057a223101\") " pod="openshift-machine-config-operator/machine-config-server-rxdvg" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.855403 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nd74n"] Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.858701 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clvjc\" (UniqueName: \"kubernetes.io/projected/62ace4e6-38ce-414a-a549-68671f040e2d-kube-api-access-clvjc\") pod \"machine-config-controller-84d6567774-tr85d\" (UID: \"62ace4e6-38ce-414a-a549-68671f040e2d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d" Nov 25 15:37:35 crc kubenswrapper[4704]: W1125 15:37:35.879281 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5250fdec_a063_483a_9bd2_b4e11479c232.slice/crio-821aebbed3b80215685c28d1995f59e08aec956eb8213050afc356e93891e04b WatchSource:0}: Error finding container 821aebbed3b80215685c28d1995f59e08aec956eb8213050afc356e93891e04b: Status 404 returned error 
can't find the container with id 821aebbed3b80215685c28d1995f59e08aec956eb8213050afc356e93891e04b Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.882597 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe30d0d8-1a03-4eed-b96c-d010f181d09e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xw9w7\" (UID: \"fe30d0d8-1a03-4eed-b96c-d010f181d09e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.893922 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:35 crc kubenswrapper[4704]: E1125 15:37:35.894044 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:36.394013806 +0000 UTC m=+142.662287587 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.894289 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:35 crc kubenswrapper[4704]: E1125 15:37:35.894750 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:36.394737238 +0000 UTC m=+142.663011019 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.897212 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.910726 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcbcq\" (UniqueName: \"kubernetes.io/projected/35b36add-59c3-4cb4-940e-76535d4d7479-kube-api-access-qcbcq\") pod \"multus-admission-controller-857f4d67dd-jrgmq\" (UID: \"35b36add-59c3-4cb4-940e-76535d4d7479\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jrgmq" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.930769 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv7ph\" (UniqueName: \"kubernetes.io/projected/ea3a5b0d-7617-4ff6-8170-b6d692474a2a-kube-api-access-jv7ph\") pod \"service-ca-9c57cc56f-7h6hx\" (UID: \"ea3a5b0d-7617-4ff6-8170-b6d692474a2a\") " pod="openshift-service-ca/service-ca-9c57cc56f-7h6hx" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.933263 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.948710 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t"] Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.950061 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfmss\" (UniqueName: \"kubernetes.io/projected/6193bcc6-1da4-414c-84df-92b1bead0762-kube-api-access-hfmss\") pod \"marketplace-operator-79b997595-8hvj9\" (UID: \"6193bcc6-1da4-414c-84df-92b1bead0762\") " pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.960211 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.966130 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kklq9" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.969354 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p5r8t"] Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.973609 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.986564 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nw2b\" (UniqueName: \"kubernetes.io/projected/2c7e3eb0-06b1-4391-9685-713da13f5bd1-kube-api-access-8nw2b\") pod \"router-default-5444994796-9whr7\" (UID: \"2c7e3eb0-06b1-4391-9685-713da13f5bd1\") " pod="openshift-ingress/router-default-5444994796-9whr7" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.987373 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clsqc\" (UniqueName: \"kubernetes.io/projected/2790e4ce-dfef-45f3-bfe4-fd6c7d63d948-kube-api-access-clsqc\") pod \"kube-storage-version-migrator-operator-b67b599dd-vtk75\" (UID: \"2790e4ce-dfef-45f3-bfe4-fd6c7d63d948\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.990458 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngh66\" (UniqueName: \"kubernetes.io/projected/7f97ced7-4114-4d79-be3a-8f419ae80727-kube-api-access-ngh66\") pod \"machine-config-operator-74547568cd-8df8d\" (UID: \"7f97ced7-4114-4d79-be3a-8f419ae80727\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.994498 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7h6hx" Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.995461 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:35 crc kubenswrapper[4704]: I1125 15:37:35.995651 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txnfx\" (UniqueName: \"kubernetes.io/projected/7b416e1d-da7d-4da7-9bae-210c815d4cf1-kube-api-access-txnfx\") pod \"machine-api-operator-5694c8668f-fz52t\" (UID: \"7b416e1d-da7d-4da7-9bae-210c815d4cf1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" Nov 25 15:37:35 crc kubenswrapper[4704]: E1125 15:37:35.996216 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:36.496176156 +0000 UTC m=+142.764449937 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:35.999844 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txnfx\" (UniqueName: \"kubernetes.io/projected/7b416e1d-da7d-4da7-9bae-210c815d4cf1-kube-api-access-txnfx\") pod \"machine-api-operator-5694c8668f-fz52t\" (UID: \"7b416e1d-da7d-4da7-9bae-210c815d4cf1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" Nov 25 15:37:36 crc kubenswrapper[4704]: W1125 15:37:36.004552 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd45c23d_4eaf_40d9_a735_eb804c875a59.slice/crio-e572af97a885e30e1e9834f20482a5232f4e4954405d803e53241f9926e74015 WatchSource:0}: Error finding container e572af97a885e30e1e9834f20482a5232f4e4954405d803e53241f9926e74015: Status 404 returned error can't find the container with id e572af97a885e30e1e9834f20482a5232f4e4954405d803e53241f9926e74015 Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.004684 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d" Nov 25 15:37:36 crc kubenswrapper[4704]: W1125 15:37:36.008249 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebac8a19_3d09_41ed_99c0_061a54e7e6ec.slice/crio-19dace1dc7b612036d6a371343047add8f2f252e325a8f02ad753945f23c5d95 WatchSource:0}: Error finding container 19dace1dc7b612036d6a371343047add8f2f252e325a8f02ad753945f23c5d95: Status 404 returned error can't find the container with id 19dace1dc7b612036d6a371343047add8f2f252e325a8f02ad753945f23c5d95 Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.011335 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsp9t" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.014954 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/034a80a3-c744-464f-a544-2f4f87ad98ed-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-r24mh\" (UID: \"034a80a3-c744-464f-a544-2f4f87ad98ed\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.019104 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrgmq" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.028160 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.029466 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2j8f\" (UniqueName: \"kubernetes.io/projected/0841ffab-5cab-4009-8b15-bbab0863a3be-kube-api-access-z2j8f\") pod \"dns-default-mc6bz\" (UID: \"0841ffab-5cab-4009-8b15-bbab0863a3be\") " pod="openshift-dns/dns-default-mc6bz" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.035293 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.043858 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.051303 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9whr7" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.060619 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bdc5\" (UniqueName: \"kubernetes.io/projected/66e027ec-5c2f-4314-8572-052d7202f17c-kube-api-access-2bdc5\") pod \"csi-hostpathplugin-b697g\" (UID: \"66e027ec-5c2f-4314-8572-052d7202f17c\") " pod="hostpath-provisioner/csi-hostpathplugin-b697g" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.062404 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mc6bz" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.068390 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt"] Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.071166 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.074776 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a06eb3cb-5f1a-4daa-aa00-abb271c35ba1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-9l555\" (UID: \"a06eb3cb-5f1a-4daa-aa00-abb271c35ba1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.095958 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qxcj2"] Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.097458 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:36 crc kubenswrapper[4704]: E1125 15:37:36.097832 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:36.597814089 +0000 UTC m=+142.866087870 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.098145 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-b697g" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.106535 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-rxdvg" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.134063 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t" event={"ID":"dd45c23d-4eaf-40d9-a735-eb804c875a59","Type":"ContainerStarted","Data":"e572af97a885e30e1e9834f20482a5232f4e4954405d803e53241f9926e74015"} Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.140226 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gbzgh" event={"ID":"fe8e9530-3977-4dc5-abe0-f8c655b58f6a","Type":"ContainerStarted","Data":"fe4b0240cec4d669c84a95a962a07de62164469f517807bf2ab19861e67b2575"} Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.143320 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" event={"ID":"17320021-32dc-4bef-befa-fa0a7c2b8533","Type":"ContainerStarted","Data":"97cc3b1151fc8f3b04bfe716a3f0164ec2ee6ec3f7324322e2d4db66d4f160fe"} Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.143956 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" event={"ID":"2b9c93c0-005e-4b54-a498-a4ae8418f839","Type":"ContainerStarted","Data":"5652c8d4a18996652ef96c931db01dbebaee164742dff8f6372dc8252f95ef4b"} Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.145170 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" event={"ID":"d60f2c5d-3a97-47a6-a311-dffb74233746","Type":"ContainerStarted","Data":"c906f0ce81caf63e6d3314bb3e1993a61b831e5e60dadc6e9d5a3012d96d2bef"} Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.146253 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.148554 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nd74n" event={"ID":"5250fdec-a063-483a-9bd2-b4e11479c232","Type":"ContainerStarted","Data":"821aebbed3b80215685c28d1995f59e08aec956eb8213050afc356e93891e04b"} Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.154555 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb" event={"ID":"0d6182fe-6ca6-4384-99ce-501079ac58ad","Type":"ContainerStarted","Data":"41e033f9b501d32f9df8251e28a8e49bd6350a52342f1a1803a0250852c2e8e7"} Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.157377 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h7k8v" event={"ID":"6a92f740-f168-4d3b-b225-a73109091d7d","Type":"ContainerStarted","Data":"0baf86cd4263a8404046a7260f9e537298c16249a37046d5ed5d0f05c50660ba"} Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.180994 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p" event={"ID":"23e9d749-57d6-4ce1-a899-44b745738978","Type":"ContainerStarted","Data":"6d9713e8b167544aee4ab82421fcec367ccb3d53cf110ae69b8a01ac7fcead39"} Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.183459 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz" event={"ID":"0ed8b5be-28a8-4dcf-ad65-16d392570684","Type":"ContainerStarted","Data":"787b0b3c9113ba1cdb7f4656b3f3b16b872c645af93e4100b282235fecfd74bf"} Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.189582 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" event={"ID":"4b54a115-ed61-47a7-b447-400aa4f75b1b","Type":"ContainerStarted","Data":"79761c8b3091f08fa7e65a55e2107aba039e0991b97b9be1ddb48bccb5307870"} Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.199224 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:36 crc kubenswrapper[4704]: E1125 15:37:36.199379 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:36.69935838 +0000 UTC m=+142.967632161 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.199519 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:36 crc kubenswrapper[4704]: E1125 15:37:36.199956 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:36.699943917 +0000 UTC m=+142.968217688 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.202769 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt" event={"ID":"2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9","Type":"ContainerStarted","Data":"f07a8c5f0637a278c60343a1aac73d2b717aa4b23ee98d9bd5096169cee697ed"} Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.213591 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qxl8w" event={"ID":"fff4a5ac-b41a-4c64-b448-5a687e16e9cd","Type":"ContainerStarted","Data":"6de3d44ed5f6808e52aa1065879b5104d4888321aad6f8942fc72d38036d44f1"} Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.214585 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t" event={"ID":"ebac8a19-3d09-41ed-99c0-061a54e7e6ec","Type":"ContainerStarted","Data":"19dace1dc7b612036d6a371343047add8f2f252e325a8f02ad753945f23c5d95"} Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.216580 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7pmpw" event={"ID":"6e167ba8-a633-42df-963a-913ba4fe20bf","Type":"ContainerStarted","Data":"a344cac793225f108b87754f8b957a2f9af92486719af7b5bca96cc758de4cef"} Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.267600 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx"] Nov 
25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.279604 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.290442 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555" Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.294359 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jrgmq"] Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.300560 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:36 crc kubenswrapper[4704]: E1125 15:37:36.301324 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:36.801298352 +0000 UTC m=+143.069572133 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.301521 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:36 crc kubenswrapper[4704]: E1125 15:37:36.302284 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:36.802252371 +0000 UTC m=+143.070526152 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.354628 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw"] Nov 25 15:37:36 crc kubenswrapper[4704]: W1125 15:37:36.383359 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b7507cc_285f_4fa5_b478_ad6a31a6855c.slice/crio-4c6b4ec0f0080ec4532555108997c2786f26fcfe90bf419e739f8f71057267b4 WatchSource:0}: Error finding container 4c6b4ec0f0080ec4532555108997c2786f26fcfe90bf419e739f8f71057267b4: Status 404 returned error can't find the container with id 4c6b4ec0f0080ec4532555108997c2786f26fcfe90bf419e739f8f71057267b4 Nov 25 15:37:36 crc kubenswrapper[4704]: W1125 15:37:36.386973 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35b36add_59c3_4cb4_940e_76535d4d7479.slice/crio-98421455b7493df6b0393f0975792e0a01c5fd4fa7416919e573fdf682243592 WatchSource:0}: Error finding container 98421455b7493df6b0393f0975792e0a01c5fd4fa7416919e573fdf682243592: Status 404 returned error can't find the container with id 98421455b7493df6b0393f0975792e0a01c5fd4fa7416919e573fdf682243592 Nov 25 15:37:36 crc kubenswrapper[4704]: W1125 15:37:36.401443 4704 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01031250_19c0_4447_890a_ead82e3257ff.slice/crio-8b92aea13f1bd5831d77e083e65b9a3a7b4f465d45be9a421785f121b1347f7d WatchSource:0}: Error finding container 8b92aea13f1bd5831d77e083e65b9a3a7b4f465d45be9a421785f121b1347f7d: Status 404 returned error can't find the container with id 8b92aea13f1bd5831d77e083e65b9a3a7b4f465d45be9a421785f121b1347f7d Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.403223 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:36 crc kubenswrapper[4704]: E1125 15:37:36.404578 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:36.904531063 +0000 UTC m=+143.172804844 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.505269 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:36 crc kubenswrapper[4704]: E1125 15:37:36.505767 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:37.005750854 +0000 UTC m=+143.274024635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.596103 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww"] Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.606769 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:36 crc kubenswrapper[4704]: E1125 15:37:36.607411 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:37.107384427 +0000 UTC m=+143.375658208 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.709541 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:36 crc kubenswrapper[4704]: E1125 15:37:36.710058 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:37.210045102 +0000 UTC m=+143.478318873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:36 crc kubenswrapper[4704]: W1125 15:37:36.793268 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c7e3eb0_06b1_4391_9685_713da13f5bd1.slice/crio-9c553e00f0cbc3884a2be27e8f255ff045189185b533220ad00e8faa008fd041 WatchSource:0}: Error finding container 9c553e00f0cbc3884a2be27e8f255ff045189185b533220ad00e8faa008fd041: Status 404 returned error can't find the container with id 9c553e00f0cbc3884a2be27e8f255ff045189185b533220ad00e8faa008fd041 Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.811807 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:36 crc kubenswrapper[4704]: E1125 15:37:36.812299 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:37.312282084 +0000 UTC m=+143.580555855 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.919240 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vsp9t"] Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.921286 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:36 crc kubenswrapper[4704]: E1125 15:37:36.921751 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:37.421734966 +0000 UTC m=+143.690008747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:36 crc kubenswrapper[4704]: I1125 15:37:36.961448 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t"] Nov 25 15:37:36 crc kubenswrapper[4704]: W1125 15:37:36.988944 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b78cd0f_f99d_4774_bc16_002fd09387ed.slice/crio-10776205c6873604a17de93401dde442110e10b4f8f6859157616691f72ee137 WatchSource:0}: Error finding container 10776205c6873604a17de93401dde442110e10b4f8f6859157616691f72ee137: Status 404 returned error can't find the container with id 10776205c6873604a17de93401dde442110e10b4f8f6859157616691f72ee137 Nov 25 15:37:37 crc kubenswrapper[4704]: E1125 15:37:37.025053 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:37.52502697 +0000 UTC m=+143.793300751 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.024185 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.036820 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:37 crc kubenswrapper[4704]: E1125 15:37:37.037591 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:37.537573905 +0000 UTC m=+143.805847686 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.055760 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7h6hx"] Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.075033 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d"] Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.089350 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kklq9"] Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.093591 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7"] Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.101872 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d"] Nov 25 15:37:37 crc kubenswrapper[4704]: W1125 15:37:37.123568 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe30d0d8_1a03_4eed_b96c_d010f181d09e.slice/crio-3e51f89669a47ac1b866153a274d445658ce9e93ff5baf78ae7345e4b614107b WatchSource:0}: Error finding container 3e51f89669a47ac1b866153a274d445658ce9e93ff5baf78ae7345e4b614107b: Status 404 returned error can't find the container with id 3e51f89669a47ac1b866153a274d445658ce9e93ff5baf78ae7345e4b614107b Nov 25 15:37:37 crc 
kubenswrapper[4704]: I1125 15:37:37.138399 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:37 crc kubenswrapper[4704]: E1125 15:37:37.138994 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:37.638970241 +0000 UTC m=+143.907244022 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.192612 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8hvj9"] Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.212626 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-b697g"] Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.222615 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9whr7" event={"ID":"2c7e3eb0-06b1-4391-9685-713da13f5bd1","Type":"ContainerStarted","Data":"9c553e00f0cbc3884a2be27e8f255ff045189185b533220ad00e8faa008fd041"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.223669 4704 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d" event={"ID":"62ace4e6-38ce-414a-a549-68671f040e2d","Type":"ContainerStarted","Data":"55d7f940bffc74e5b6923654e0da36dcd7a406e753195e7c20483ce4b37073f4"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.224698 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7" event={"ID":"fe30d0d8-1a03-4eed-b96c-d010f181d09e","Type":"ContainerStarted","Data":"3e51f89669a47ac1b866153a274d445658ce9e93ff5baf78ae7345e4b614107b"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.226387 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx" event={"ID":"5158c6c8-eef4-48f4-82c7-6db821d6894e","Type":"ContainerStarted","Data":"156e5058d1fc451364cea2c3f4948ce79208dc72068685fdbf112cb47ecbff2e"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.228117 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kklq9" event={"ID":"b8100f5c-f741-4ef6-b462-d2ce26957517","Type":"ContainerStarted","Data":"1717cad5fd37b03363aa1e02502192b667adc40f7b4090ab6ac74efff3b88637"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.229566 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsp9t" event={"ID":"3b78cd0f-f99d-4774-bc16-002fd09387ed","Type":"ContainerStarted","Data":"10776205c6873604a17de93401dde442110e10b4f8f6859157616691f72ee137"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.232952 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" 
event={"ID":"4ce56dcb-a916-41ca-b706-df5e157576eb","Type":"ContainerStarted","Data":"5319d1cc3d02a76721dc0a8427195186872b40d7749d0d58fcb409e37e41d813"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.233962 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7h6hx" event={"ID":"ea3a5b0d-7617-4ff6-8170-b6d692474a2a","Type":"ContainerStarted","Data":"53bead5be48de33d610551467ca4040c0e713368e06247be2f5dd568815f7860"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.238653 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" event={"ID":"42554b00-c5ca-41d5-b84e-af36e56239c6","Type":"ContainerStarted","Data":"d898674931280a3375c8308f7d7dc21895c4a119bb772ed9af4c0b9dbf81acc2"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.240094 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:37 crc kubenswrapper[4704]: E1125 15:37:37.240633 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:37.740592834 +0000 UTC m=+144.008866785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.243917 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw" event={"ID":"01031250-19c0-4447-890a-ead82e3257ff","Type":"ContainerStarted","Data":"8b92aea13f1bd5831d77e083e65b9a3a7b4f465d45be9a421785f121b1347f7d"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.245579 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww" event={"ID":"20d0ae75-5631-4169-98bc-92333964e503","Type":"ContainerStarted","Data":"42f3f70c6f41c86d06e103dd295086aab47d840be878c2c195ae1b562d1eaf20"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.247214 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" event={"ID":"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6","Type":"ContainerStarted","Data":"a6281e5f57d6109c78ed7a3af74e9dc80ab0e1e2090aa8f90a37744dabd523f7"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.250234 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrgmq" event={"ID":"35b36add-59c3-4cb4-940e-76535d4d7479","Type":"ContainerStarted","Data":"98421455b7493df6b0393f0975792e0a01c5fd4fa7416919e573fdf682243592"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.252555 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-qxcj2" event={"ID":"6b7507cc-285f-4fa5-b478-ad6a31a6855c","Type":"ContainerStarted","Data":"4c6b4ec0f0080ec4532555108997c2786f26fcfe90bf419e739f8f71057267b4"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.257245 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" event={"ID":"6193bcc6-1da4-414c-84df-92b1bead0762","Type":"ContainerStarted","Data":"cb3481ba0ac7a4226998041b61a6cd551caf3fe5d08ed44535a683ad1d2caf4f"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.258761 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb" event={"ID":"0d6182fe-6ca6-4384-99ce-501079ac58ad","Type":"ContainerStarted","Data":"9572e8cc2c660a27bf781bf770861541e7cb9b49cab7324672749e044a5839cd"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.259812 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t" event={"ID":"38b9ead9-033f-44cb-9657-6a078bed2c0d","Type":"ContainerStarted","Data":"b971bccd687ebc803beba0b3d32c8b57b1fbf488f2b50765c372355d5971615a"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.262026 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt" event={"ID":"a37f016b-da0c-4046-9dbe-c4ce4eb4fcfc","Type":"ContainerStarted","Data":"94617bc770618941ab302e55afbb8cd00bb71e7d6a2da3b0711c72025cc347b9"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.263097 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d" event={"ID":"7f97ced7-4114-4d79-be3a-8f419ae80727","Type":"ContainerStarted","Data":"795ef3c6990f1a951cc28966a6a66c999a50218774d95d81333c01826f557b5d"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.266031 
4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-rxdvg" event={"ID":"0505a32a-4fec-40b4-a7ca-e4057a223101","Type":"ContainerStarted","Data":"d91734e371c03b8e2fbd059d4de83b9d33bb528f733b42edd94e788f7ae22bcc"} Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.266063 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.267475 4704 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-5llzt container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.6:6443/healthz\": dial tcp 10.217.0.6:6443: connect: connection refused" start-of-body= Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.267532 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" podUID="17320021-32dc-4bef-befa-fa0a7c2b8533" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.6:6443/healthz\": dial tcp 10.217.0.6:6443: connect: connection refused" Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.330902 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mc6bz"] Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.340930 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:37 crc kubenswrapper[4704]: E1125 15:37:37.341088 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:37.841066201 +0000 UTC m=+144.109339982 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.341347 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:37 crc kubenswrapper[4704]: E1125 15:37:37.341713 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:37.841702841 +0000 UTC m=+144.109976622 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.427105 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh"] Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.432557 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555"] Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.442012 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75"] Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.442539 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:37 crc kubenswrapper[4704]: E1125 15:37:37.445808 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:37.945763458 +0000 UTC m=+144.214037399 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.478122 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fz52t"] Nov 25 15:37:37 crc kubenswrapper[4704]: W1125 15:37:37.536002 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2790e4ce_dfef_45f3_bfe4_fd6c7d63d948.slice/crio-e6693924e3ff0edcd057e856fbc8bc2c8c8b49320e999eec2832db05fced1509 WatchSource:0}: Error finding container e6693924e3ff0edcd057e856fbc8bc2c8c8b49320e999eec2832db05fced1509: Status 404 returned error can't find the container with id e6693924e3ff0edcd057e856fbc8bc2c8c8b49320e999eec2832db05fced1509 Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.544199 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:37 crc kubenswrapper[4704]: E1125 15:37:37.544917 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:38.044901425 +0000 UTC m=+144.313175206 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.644777 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" podStartSLOduration=120.644752514 podStartE2EDuration="2m0.644752514s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:37.610443533 +0000 UTC m=+143.878717344" watchObservedRunningTime="2025-11-25 15:37:37.644752514 +0000 UTC m=+143.913026295" Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.645060 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:37 crc kubenswrapper[4704]: E1125 15:37:37.645337 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:38.145310111 +0000 UTC m=+144.413583912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.645402 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:37 crc kubenswrapper[4704]: E1125 15:37:37.646143 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:38.146123156 +0000 UTC m=+144.414396937 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.704722 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zvh6p" podStartSLOduration=120.70470237 podStartE2EDuration="2m0.70470237s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:37.643488565 +0000 UTC m=+143.911762346" watchObservedRunningTime="2025-11-25 15:37:37.70470237 +0000 UTC m=+143.972976151" Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.747025 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:37 crc kubenswrapper[4704]: E1125 15:37:37.747258 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:38.247228663 +0000 UTC m=+144.515502454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.747923 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:37 crc kubenswrapper[4704]: E1125 15:37:37.748277 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:38.248268145 +0000 UTC m=+144.516541926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.849806 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:37 crc kubenswrapper[4704]: E1125 15:37:37.850062 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:38.350044472 +0000 UTC m=+144.618318253 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.850152 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:37 crc kubenswrapper[4704]: E1125 15:37:37.850429 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:38.350420854 +0000 UTC m=+144.618694635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:37 crc kubenswrapper[4704]: I1125 15:37:37.957668 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:37 crc kubenswrapper[4704]: E1125 15:37:37.958619 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:38.458596457 +0000 UTC m=+144.726870238 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.060116 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:38 crc kubenswrapper[4704]: E1125 15:37:38.060606 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:38.560589332 +0000 UTC m=+144.828863113 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.161483 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:38 crc kubenswrapper[4704]: E1125 15:37:38.161974 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:38.661709219 +0000 UTC m=+144.929983000 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.263352 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:38 crc kubenswrapper[4704]: E1125 15:37:38.263744 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:38.763726064 +0000 UTC m=+145.031999845 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.276385 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mc6bz" event={"ID":"0841ffab-5cab-4009-8b15-bbab0863a3be","Type":"ContainerStarted","Data":"64100656a13a2d491badc2e2d3f641470c20b02d0feac8abbdba1e52fd64b61d"} Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.278372 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt" event={"ID":"2b2cf6a0-7242-42e1-a1a2-2c76cfc7bcd9","Type":"ContainerStarted","Data":"02bf7f7a2083c3ad9feffa2633853d0991fdcf4f4730809520c1eae3eb7590f8"} Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.279701 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555" event={"ID":"a06eb3cb-5f1a-4daa-aa00-abb271c35ba1","Type":"ContainerStarted","Data":"597a596a28b891ea3eca439999976bc13ad8589e765df88d2de1dca3a18ebd23"} Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.280487 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh" event={"ID":"034a80a3-c744-464f-a544-2f4f87ad98ed","Type":"ContainerStarted","Data":"086afb05338aa3b59033fccc43bfb55909c8eaaea49216145fa9bb73bb3c41fd"} Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.281922 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b697g" 
event={"ID":"66e027ec-5c2f-4314-8572-052d7202f17c","Type":"ContainerStarted","Data":"0393d43e3ab8dcdb246413529b565849eefc61908164cd751365de217529efb2"} Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.285506 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75" event={"ID":"2790e4ce-dfef-45f3-bfe4-fd6c7d63d948","Type":"ContainerStarted","Data":"e6693924e3ff0edcd057e856fbc8bc2c8c8b49320e999eec2832db05fced1509"} Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.287298 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" event={"ID":"7b416e1d-da7d-4da7-9bae-210c815d4cf1","Type":"ContainerStarted","Data":"b5b085f39e37a8864b2d0505d3e0c38096e2f08fffba64455d2c49bee8ee8fa9"} Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.289457 4704 generic.go:334] "Generic (PLEG): container finished" podID="5d946af5-4a4e-476e-ad32-3eae6ad6c8f7" containerID="1b9e0d5a7e5ac5c29f2134fa32c9a2151392860b1b1efeae3e4497c41c8d3b44" exitCode=0 Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.289526 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z692d" event={"ID":"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7","Type":"ContainerDied","Data":"1b9e0d5a7e5ac5c29f2134fa32c9a2151392860b1b1efeae3e4497c41c8d3b44"} Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.291331 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h7k8v" event={"ID":"6a92f740-f168-4d3b-b225-a73109091d7d","Type":"ContainerStarted","Data":"bae90ebd6aff1ac62381cba69839668005a2ba0e673f683fc9b2c71be176b318"} Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.292149 4704 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-5llzt container/oauth-openshift namespace/openshift-authentication: 
Readiness probe status=failure output="Get \"https://10.217.0.6:6443/healthz\": dial tcp 10.217.0.6:6443: connect: connection refused" start-of-body= Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.292190 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" podUID="17320021-32dc-4bef-befa-fa0a7c2b8533" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.6:6443/healthz\": dial tcp 10.217.0.6:6443: connect: connection refused" Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.319864 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-7pmpw" podStartSLOduration=121.319822883 podStartE2EDuration="2m1.319822883s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:38.31583864 +0000 UTC m=+144.584112441" watchObservedRunningTime="2025-11-25 15:37:38.319822883 +0000 UTC m=+144.588096684" Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.365105 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:38 crc kubenswrapper[4704]: E1125 15:37:38.365491 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:38.865474921 +0000 UTC m=+145.133748702 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.466553 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:38 crc kubenswrapper[4704]: E1125 15:37:38.467885 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:38.967866207 +0000 UTC m=+145.236140188 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.569058 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:38 crc kubenswrapper[4704]: E1125 15:37:38.569443 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:39.069398628 +0000 UTC m=+145.337672419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.569613 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:38 crc kubenswrapper[4704]: E1125 15:37:38.570044 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:39.070027067 +0000 UTC m=+145.338300848 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.671169 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:38 crc kubenswrapper[4704]: E1125 15:37:38.671350 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:39.17132425 +0000 UTC m=+145.439598031 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.672105 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:38 crc kubenswrapper[4704]: E1125 15:37:38.672675 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:39.17264731 +0000 UTC m=+145.440921281 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.773113 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:38 crc kubenswrapper[4704]: E1125 15:37:38.773299 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:39.273264342 +0000 UTC m=+145.541538123 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.773395 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:38 crc kubenswrapper[4704]: E1125 15:37:38.773771 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:39.273758367 +0000 UTC m=+145.542032148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.874485 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:38 crc kubenswrapper[4704]: E1125 15:37:38.874701 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:39.374664918 +0000 UTC m=+145.642938709 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.874964 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:38 crc kubenswrapper[4704]: E1125 15:37:38.875334 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:39.375317408 +0000 UTC m=+145.643591189 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.976562 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:38 crc kubenswrapper[4704]: E1125 15:37:38.976688 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:39.476667503 +0000 UTC m=+145.744941284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:38 crc kubenswrapper[4704]: I1125 15:37:38.976959 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:38 crc kubenswrapper[4704]: E1125 15:37:38.977274 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:39.477266141 +0000 UTC m=+145.745539922 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.077754 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:39 crc kubenswrapper[4704]: E1125 15:37:39.078336 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:39.578316977 +0000 UTC m=+145.846590758 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.180110 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:39 crc kubenswrapper[4704]: E1125 15:37:39.180686 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:39.680667742 +0000 UTC m=+145.948941523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.281861 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:39 crc kubenswrapper[4704]: E1125 15:37:39.282268 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:39.782245263 +0000 UTC m=+146.050519044 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.298072 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d" event={"ID":"62ace4e6-38ce-414a-a549-68671f040e2d","Type":"ContainerStarted","Data":"92cd6acc82185b3c35b79979dea01c5e1218e0437352a7809bd54296702b45a6"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.299664 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t" event={"ID":"dd45c23d-4eaf-40d9-a735-eb804c875a59","Type":"ContainerStarted","Data":"3083f3c37b7f6a366e382941aca87a8830cbedc1a09ce5ea8e12680ed30f8d15"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.301137 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx" event={"ID":"5158c6c8-eef4-48f4-82c7-6db821d6894e","Type":"ContainerStarted","Data":"f78172ef0414e5124135fb852b16068fe8f21ea8ccf61c7faeeba2719f660a3f"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.302868 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt" event={"ID":"a37f016b-da0c-4046-9dbe-c4ce4eb4fcfc","Type":"ContainerStarted","Data":"62d4e7dca18c8a2176400471e66d7e54eb0b7a3cd51380843b513f61bdf9b2e5"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.303934 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" event={"ID":"6193bcc6-1da4-414c-84df-92b1bead0762","Type":"ContainerStarted","Data":"1a6d289ac1aa1275bf04bb68f32c952870c145ca1dff24f829e7be3e291711e4"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.305589 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qxcj2" event={"ID":"6b7507cc-285f-4fa5-b478-ad6a31a6855c","Type":"ContainerStarted","Data":"55ee11649b7f1f768cad1ece49e19fe5ff079e6cd9b32b7676605bd317ef84b0"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.306893 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d" event={"ID":"7f97ced7-4114-4d79-be3a-8f419ae80727","Type":"ContainerStarted","Data":"ad747f63a7e50f5729105dd4ddeb5731e944a6034757731d33c03a5b109f98f4"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.308474 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsp9t" event={"ID":"3b78cd0f-f99d-4774-bc16-002fd09387ed","Type":"ContainerStarted","Data":"311f6817e3ac036f4c494838e16c8f4054563b2845204824e1078d4794bdfac6"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.310501 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kklq9" event={"ID":"b8100f5c-f741-4ef6-b462-d2ce26957517","Type":"ContainerStarted","Data":"4733ce3cedd7084d35dc7599f31f798a35094a0560479a97b587a43a858f8dec"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.312199 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qxl8w" event={"ID":"fff4a5ac-b41a-4c64-b448-5a687e16e9cd","Type":"ContainerStarted","Data":"09191aa624beb4c790a97d09f01626575ee53aa37226d028feb56019998c8252"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.313481 4704 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nd74n" event={"ID":"5250fdec-a063-483a-9bd2-b4e11479c232","Type":"ContainerStarted","Data":"209064fbfc25f2b84a7b45e711fef89de83398c5c143653deef991e70800c021"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.314751 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9whr7" event={"ID":"2c7e3eb0-06b1-4391-9685-713da13f5bd1","Type":"ContainerStarted","Data":"967831ddd7758b352cdf289835e87c53367d158898933a55b3b9632b1858cb06"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.317284 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t" event={"ID":"ebac8a19-3d09-41ed-99c0-061a54e7e6ec","Type":"ContainerStarted","Data":"5838ddfefa3c238702763a9e21bccddb0664603d5975478db45a11cd481e73d4"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.318643 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t" event={"ID":"38b9ead9-033f-44cb-9657-6a078bed2c0d","Type":"ContainerStarted","Data":"535b6f8581345683bf90ded26a97ff2e554d4df43f8222991d8cb64e5d622052"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.320079 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz" event={"ID":"0ed8b5be-28a8-4dcf-ad65-16d392570684","Type":"ContainerStarted","Data":"debacb8a390858bdac859597e7c4871ad3cdc0d20a36223311ad7f7be53ee1c0"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.321557 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7" event={"ID":"fe30d0d8-1a03-4eed-b96c-d010f181d09e","Type":"ContainerStarted","Data":"6f2e4aed465bcbd55d2cb07e9a5ba2509817bf2a5f5bd08c1ef18afaf6ad2223"} Nov 25 15:37:39 crc 
kubenswrapper[4704]: I1125 15:37:39.323186 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrgmq" event={"ID":"35b36add-59c3-4cb4-940e-76535d4d7479","Type":"ContainerStarted","Data":"a9ab0b41cfda0b99966d7d3a21bd4b3215ad37f54a0c01a4cb3c8ecffd193dc3"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.324504 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw" event={"ID":"01031250-19c0-4447-890a-ead82e3257ff","Type":"ContainerStarted","Data":"f1d000dbb10aaeeb653ec1f497b8efd5b5e84e3158a6c2b4d5b11bb6808c5d96"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.325626 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-rxdvg" event={"ID":"0505a32a-4fec-40b4-a7ca-e4057a223101","Type":"ContainerStarted","Data":"78e7dfe4f59659e36c47318a53e7306d4dddb1900b6d0408b64efc7aade94cdb"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.327011 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" event={"ID":"4b54a115-ed61-47a7-b447-400aa4f75b1b","Type":"ContainerStarted","Data":"8e0d3cc7923de837ad5b441b3218b9ae987b73af8157871eb7a3461bd4614aae"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.328863 4704 generic.go:334] "Generic (PLEG): container finished" podID="2b9c93c0-005e-4b54-a498-a4ae8418f839" containerID="b1ae729133e79690f8d8d25be80c2a84b6c14e46b5aaa13f84af3b63339b4702" exitCode=0 Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.328919 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" event={"ID":"2b9c93c0-005e-4b54-a498-a4ae8418f839","Type":"ContainerDied","Data":"b1ae729133e79690f8d8d25be80c2a84b6c14e46b5aaa13f84af3b63339b4702"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.330383 4704 
generic.go:334] "Generic (PLEG): container finished" podID="42554b00-c5ca-41d5-b84e-af36e56239c6" containerID="d898674931280a3375c8308f7d7dc21895c4a119bb772ed9af4c0b9dbf81acc2" exitCode=0 Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.330454 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" event={"ID":"42554b00-c5ca-41d5-b84e-af36e56239c6","Type":"ContainerDied","Data":"d898674931280a3375c8308f7d7dc21895c4a119bb772ed9af4c0b9dbf81acc2"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.332258 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww" event={"ID":"20d0ae75-5631-4169-98bc-92333964e503","Type":"ContainerStarted","Data":"5753d85406ba0c83dfe88f942fcd63480697880330f737ae15ead6a588b93a96"} Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.332939 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.335186 4704 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-gc5rd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.335250 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" podUID="4ce56dcb-a916-41ca-b706-df5e157576eb" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.384494 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:39 crc kubenswrapper[4704]: E1125 15:37:39.385264 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:39.885243828 +0000 UTC m=+146.153517609 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.388633 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gptmb" podStartSLOduration=122.388613712 podStartE2EDuration="2m2.388613712s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:39.369579769 +0000 UTC m=+145.637853550" watchObservedRunningTime="2025-11-25 15:37:39.388613712 +0000 UTC m=+145.656887493" Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.390425 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" podStartSLOduration=122.390418977 
podStartE2EDuration="2m2.390418977s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:39.387005752 +0000 UTC m=+145.655279533" watchObservedRunningTime="2025-11-25 15:37:39.390418977 +0000 UTC m=+145.658692758" Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.426434 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-gbzgh" podStartSLOduration=122.42641659 podStartE2EDuration="2m2.42641659s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:39.423265693 +0000 UTC m=+145.691539474" watchObservedRunningTime="2025-11-25 15:37:39.42641659 +0000 UTC m=+145.694690371" Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.440385 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-pjf9j" podStartSLOduration=122.440356147 podStartE2EDuration="2m2.440356147s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:39.439189971 +0000 UTC m=+145.707463752" watchObservedRunningTime="2025-11-25 15:37:39.440356147 +0000 UTC m=+145.708629918" Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.456922 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" podStartSLOduration=121.456890503 podStartE2EDuration="2m1.456890503s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-25 15:37:39.456691917 +0000 UTC m=+145.724965698" watchObservedRunningTime="2025-11-25 15:37:39.456890503 +0000 UTC m=+145.725164294" Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.485783 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:39 crc kubenswrapper[4704]: E1125 15:37:39.487505 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:39.98745992 +0000 UTC m=+146.255733711 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.490810 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:39 crc kubenswrapper[4704]: E1125 15:37:39.492420 4704 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:39.992394891 +0000 UTC m=+146.260668862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.592857 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:39 crc kubenswrapper[4704]: E1125 15:37:39.593143 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:40.093098346 +0000 UTC m=+146.361372127 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.593370 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:39 crc kubenswrapper[4704]: E1125 15:37:39.593862 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:40.093843298 +0000 UTC m=+146.362117089 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.695473 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:39 crc kubenswrapper[4704]: E1125 15:37:39.695708 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:40.195675128 +0000 UTC m=+146.463948909 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.695875 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:39 crc kubenswrapper[4704]: E1125 15:37:39.696312 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:40.196303007 +0000 UTC m=+146.464576788 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.797217 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:39 crc kubenswrapper[4704]: E1125 15:37:39.797446 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:40.297413614 +0000 UTC m=+146.565687395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.797907 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:39 crc kubenswrapper[4704]: E1125 15:37:39.798254 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:40.298242559 +0000 UTC m=+146.566516340 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.899220 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:39 crc kubenswrapper[4704]: E1125 15:37:39.899428 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:40.399390317 +0000 UTC m=+146.667664108 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:39 crc kubenswrapper[4704]: I1125 15:37:39.899688 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:39 crc kubenswrapper[4704]: E1125 15:37:39.900068 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:40.400054987 +0000 UTC m=+146.668328768 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.000606 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:40 crc kubenswrapper[4704]: E1125 15:37:40.000985 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:40.500968158 +0000 UTC m=+146.769241939 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.102207 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:40 crc kubenswrapper[4704]: E1125 15:37:40.102550 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:40.602531449 +0000 UTC m=+146.870805230 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.204235 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:40 crc kubenswrapper[4704]: E1125 15:37:40.204380 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:40.704357809 +0000 UTC m=+146.972631600 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.204516 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:40 crc kubenswrapper[4704]: E1125 15:37:40.204898 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:40.704886925 +0000 UTC m=+146.973160706 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.305527 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:40 crc kubenswrapper[4704]: E1125 15:37:40.305711 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:40.805686082 +0000 UTC m=+147.073959863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.306334 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:40 crc kubenswrapper[4704]: E1125 15:37:40.306717 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:40.806706574 +0000 UTC m=+147.074980525 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.352292 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555" event={"ID":"a06eb3cb-5f1a-4daa-aa00-abb271c35ba1","Type":"ContainerStarted","Data":"7a06433aba857b0d8b8cdbe76439b4a243ceec7d9924cc1d951ded41eef330d4"} Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.353943 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7h6hx" event={"ID":"ea3a5b0d-7617-4ff6-8170-b6d692474a2a","Type":"ContainerStarted","Data":"4387589853f238623840fa1c2dbf7d2b41f63ab1b096c16a6cee2d1bfa918745"} Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.354886 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75" event={"ID":"2790e4ce-dfef-45f3-bfe4-fd6c7d63d948","Type":"ContainerStarted","Data":"f72a76a4a1714b02bfba556e12d2c7686fea30e373692cfa53b5135d51063674"} Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.361910 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh" event={"ID":"034a80a3-c744-464f-a544-2f4f87ad98ed","Type":"ContainerStarted","Data":"1387bb9d896527d7ba865af1683a29f60b77e41605d50c5ad713d0c27153f3d0"} Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.375314 4704 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vtk75" podStartSLOduration=122.375286124 podStartE2EDuration="2m2.375286124s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:40.371912211 +0000 UTC m=+146.640186002" watchObservedRunningTime="2025-11-25 15:37:40.375286124 +0000 UTC m=+146.643559915" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.377677 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" event={"ID":"7b416e1d-da7d-4da7-9bae-210c815d4cf1","Type":"ContainerStarted","Data":"7094070cba1166208b766c21b46e21c284ec0e559d6fccc838fb1445b6514ea3"} Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.383454 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mc6bz" event={"ID":"0841ffab-5cab-4009-8b15-bbab0863a3be","Type":"ContainerStarted","Data":"06986ecd2b4a7d6cf2c5c2690846dbf9a72e550fb990249a9a7ee6f064b6988f"} Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.397756 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h7k8v" event={"ID":"6a92f740-f168-4d3b-b225-a73109091d7d","Type":"ContainerStarted","Data":"06088518b9f7d170fdaf7408a434f7bc59f673b992ca29b336580d4ca4c1b45b"} Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.398337 4704 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-gc5rd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.398470 4704 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" podUID="4ce56dcb-a916-41ca-b706-df5e157576eb" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.399392 4704 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8hvj9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.399443 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" podUID="6193bcc6-1da4-414c-84df-92b1bead0762" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.399724 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.407632 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:40 crc kubenswrapper[4704]: E1125 15:37:40.407927 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 15:37:40.907903693 +0000 UTC m=+147.176177474 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.408012 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:40 crc kubenswrapper[4704]: E1125 15:37:40.408982 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:40.908963066 +0000 UTC m=+147.177236847 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.443646 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xw9w7" podStartSLOduration=122.443622448 podStartE2EDuration="2m2.443622448s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:40.420122598 +0000 UTC m=+146.688396379" watchObservedRunningTime="2025-11-25 15:37:40.443622448 +0000 UTC m=+146.711896229" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.446247 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t" podStartSLOduration=123.446234988 podStartE2EDuration="2m3.446234988s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:40.444548406 +0000 UTC m=+146.712822197" watchObservedRunningTime="2025-11-25 15:37:40.446234988 +0000 UTC m=+146.714508769" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.476244 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-9whr7" podStartSLOduration=123.476221606 podStartE2EDuration="2m3.476221606s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:40.475135613 +0000 UTC m=+146.743409414" watchObservedRunningTime="2025-11-25 15:37:40.476221606 +0000 UTC m=+146.744495387" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.496818 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" podStartSLOduration=122.496746735 podStartE2EDuration="2m2.496746735s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:40.491322129 +0000 UTC m=+146.759595910" watchObservedRunningTime="2025-11-25 15:37:40.496746735 +0000 UTC m=+146.765020536" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.518681 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz" podStartSLOduration=122.518658396 podStartE2EDuration="2m2.518658396s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:40.517259493 +0000 UTC m=+146.785533264" watchObservedRunningTime="2025-11-25 15:37:40.518658396 +0000 UTC m=+146.786932187" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.522784 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:40 crc kubenswrapper[4704]: E1125 15:37:40.524409 4704 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:41.024376161 +0000 UTC m=+147.292649942 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.545663 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qxcj2" podStartSLOduration=7.545634602 podStartE2EDuration="7.545634602s" podCreationTimestamp="2025-11-25 15:37:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:40.540835655 +0000 UTC m=+146.809109446" watchObservedRunningTime="2025-11-25 15:37:40.545634602 +0000 UTC m=+146.813908383" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.570248 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt" podStartSLOduration=122.570221376 podStartE2EDuration="2m2.570221376s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:40.57003742 +0000 UTC m=+146.838311221" watchObservedRunningTime="2025-11-25 15:37:40.570221376 +0000 UTC m=+146.838495157" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.616521 4704 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-p5r8t" podStartSLOduration=123.616502183 podStartE2EDuration="2m3.616502183s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:40.613903804 +0000 UTC m=+146.882177615" watchObservedRunningTime="2025-11-25 15:37:40.616502183 +0000 UTC m=+146.884775964" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.625542 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:40 crc kubenswrapper[4704]: E1125 15:37:40.625968 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:41.125950453 +0000 UTC m=+147.394224234 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.651038 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4h6ww" podStartSLOduration=122.65101299 podStartE2EDuration="2m2.65101299s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:40.648860984 +0000 UTC m=+146.917134765" watchObservedRunningTime="2025-11-25 15:37:40.65101299 +0000 UTC m=+146.919286771" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.695061 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw" podStartSLOduration=122.695038229 podStartE2EDuration="2m2.695038229s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:40.692720548 +0000 UTC m=+146.960994329" watchObservedRunningTime="2025-11-25 15:37:40.695038229 +0000 UTC m=+146.963312010" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.710928 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-rxdvg" podStartSLOduration=8.710883214 podStartE2EDuration="8.710883214s" podCreationTimestamp="2025-11-25 15:37:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:40.709368868 +0000 UTC m=+146.977642639" watchObservedRunningTime="2025-11-25 15:37:40.710883214 +0000 UTC m=+146.979156995" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.728407 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:40 crc kubenswrapper[4704]: E1125 15:37:40.728806 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:41.228758732 +0000 UTC m=+147.497032513 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.732648 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kklq9" podStartSLOduration=122.73262833 podStartE2EDuration="2m2.73262833s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:40.73260016 +0000 UTC m=+147.000873951" watchObservedRunningTime="2025-11-25 15:37:40.73262833 +0000 UTC m=+147.000902111" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.754934 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-nd74n" podStartSLOduration=123.754918223 podStartE2EDuration="2m3.754918223s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:40.75218744 +0000 UTC m=+147.020461221" watchObservedRunningTime="2025-11-25 15:37:40.754918223 +0000 UTC m=+147.023192004" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.776891 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wj7zx" podStartSLOduration=123.776871206 podStartE2EDuration="2m3.776871206s" podCreationTimestamp="2025-11-25 15:35:37 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:40.776593947 +0000 UTC m=+147.044867728" watchObservedRunningTime="2025-11-25 15:37:40.776871206 +0000 UTC m=+147.045144997" Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.829827 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:40 crc kubenswrapper[4704]: E1125 15:37:40.830170 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:41.330157708 +0000 UTC m=+147.598431489 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.931055 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:40 crc kubenswrapper[4704]: E1125 15:37:40.931511 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:41.431489722 +0000 UTC m=+147.699763503 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:40 crc kubenswrapper[4704]: I1125 15:37:40.931666 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:40 crc kubenswrapper[4704]: E1125 15:37:40.932175 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:41.432167583 +0000 UTC m=+147.700441364 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.033525 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.034312 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:41.534288701 +0000 UTC m=+147.802562482 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.052588 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-9whr7" Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.054946 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.055041 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.135926 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.136452 4704 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:41.636425479 +0000 UTC m=+147.904699260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.237521 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.237757 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:41.737726943 +0000 UTC m=+148.006000724 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.237920 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.238234 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:41.738222538 +0000 UTC m=+148.006496319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.339387 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.339641 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:41.839610683 +0000 UTC m=+148.107884464 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.340191 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.340624 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:41.840608534 +0000 UTC m=+148.108882315 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.407806 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" event={"ID":"42554b00-c5ca-41d5-b84e-af36e56239c6","Type":"ContainerStarted","Data":"feecc1e7f3fafcf6335c90ef6889c4be338c94905e27b08f2fcccf16b108e1a5"} Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.410224 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d" event={"ID":"7f97ced7-4114-4d79-be3a-8f419ae80727","Type":"ContainerStarted","Data":"62d98f16d1192bf00f49e9625d4605be168ba4562689e762a3c5fde704da4de8"} Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.411671 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z692d" event={"ID":"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7","Type":"ContainerStarted","Data":"98ea41a217f4fda045ab0f4a5a27f32c0e7a62fc92da0fa70145998c4a1fd73d"} Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.413294 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" event={"ID":"4b54a115-ed61-47a7-b447-400aa4f75b1b","Type":"ContainerStarted","Data":"75bb5ed59fdfa871ce167d5e326822f7f37ff4ffb56da359c454f8b49eb724d0"} Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.414606 4704 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8hvj9 container/marketplace-operator 
namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.414676 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" podUID="6193bcc6-1da4-414c-84df-92b1bead0762" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.431366 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7h6hx" podStartSLOduration=123.431341033 podStartE2EDuration="2m3.431341033s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:41.429964401 +0000 UTC m=+147.698238182" watchObservedRunningTime="2025-11-25 15:37:41.431341033 +0000 UTC m=+147.699614824" Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.443655 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h7k8v" podStartSLOduration=124.44363282 podStartE2EDuration="2m4.44363282s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:41.442620749 +0000 UTC m=+147.710894540" watchObservedRunningTime="2025-11-25 15:37:41.44363282 +0000 UTC m=+147.711906601" Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.447928 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.448092 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:41.948065526 +0000 UTC m=+148.216339307 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.448642 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.450339 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:41.950325685 +0000 UTC m=+148.218599456 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.471407 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-9l555" podStartSLOduration=123.47138511 podStartE2EDuration="2m3.47138511s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:41.470482532 +0000 UTC m=+147.738756313" watchObservedRunningTime="2025-11-25 15:37:41.47138511 +0000 UTC m=+147.739658881" Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.550401 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.550606 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.050561305 +0000 UTC m=+148.318835086 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.550871 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.551173 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.051162844 +0000 UTC m=+148.319436615 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.651767 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.651968 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.151934751 +0000 UTC m=+148.420208532 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.652161 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.652523 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.152509018 +0000 UTC m=+148.420782799 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.753838 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.754069 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.254039828 +0000 UTC m=+148.522313609 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.754333 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.754710 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.254691258 +0000 UTC m=+148.522965039 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.855144 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.855359 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.355323251 +0000 UTC m=+148.623597032 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.855503 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.856019 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.356009882 +0000 UTC m=+148.624283663 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.956531 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.956680 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.456661715 +0000 UTC m=+148.724935496 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:41 crc kubenswrapper[4704]: I1125 15:37:41.956754 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:41 crc kubenswrapper[4704]: E1125 15:37:41.957143 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.457132339 +0000 UTC m=+148.725406120 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.053613 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.053686 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.058116 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.058352 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.558302268 +0000 UTC m=+148.826576049 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.058512 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.058955 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.558944778 +0000 UTC m=+148.827218559 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.159729 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.159856 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.659838139 +0000 UTC m=+148.928111920 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.160044 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.160380 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.660367885 +0000 UTC m=+148.928641666 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.261801 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.261964 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.761925906 +0000 UTC m=+149.030199687 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.262221 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.262320 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.262361 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.264089 4704 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.764069821 +0000 UTC m=+149.032343602 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.274599 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.315080 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.363953 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:42 crc 
kubenswrapper[4704]: E1125 15:37:42.364201 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.864165178 +0000 UTC m=+149.132438959 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.364281 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.364347 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.364418 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.364843 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.864833978 +0000 UTC m=+149.133107759 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.368311 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.368320 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.427456 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t" 
event={"ID":"dd45c23d-4eaf-40d9-a735-eb804c875a59","Type":"ContainerStarted","Data":"e379fd678b62ec44b7747b1355972afad08d815f74fffcbf77af74a3f482cff8"} Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.429462 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt" event={"ID":"a37f016b-da0c-4046-9dbe-c4ce4eb4fcfc","Type":"ContainerStarted","Data":"af6eb7ebaa1da510001bded25baba47a81a66772a4c6751bb47312f59ffd9cee"} Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.431703 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d" event={"ID":"62ace4e6-38ce-414a-a549-68671f040e2d","Type":"ContainerStarted","Data":"096d730c4bd2d43fad33152895ce768eb3205d0ce77e3e916a405fe585729100"} Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.431746 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.442769 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.447609 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.465503 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-r24mh" podStartSLOduration=124.465484331 podStartE2EDuration="2m4.465484331s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:41.491646251 +0000 UTC m=+147.759920032" watchObservedRunningTime="2025-11-25 15:37:42.465484331 +0000 UTC m=+148.733758112" Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.467392 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.467524 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.967502783 +0000 UTC m=+149.235776564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.467691 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.468042 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:42.968031069 +0000 UTC m=+149.236304850 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.537082 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.572404 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.572579 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.072553101 +0000 UTC m=+149.340826892 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.575155 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.575628 4704 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.075609084 +0000 UTC m=+149.343882865 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.679599 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.679905 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.179888249 +0000 UTC m=+149.448162030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.703737 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" podStartSLOduration=125.703700838 podStartE2EDuration="2m5.703700838s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:42.466804052 +0000 UTC m=+148.735077853" watchObservedRunningTime="2025-11-25 15:37:42.703700838 +0000 UTC m=+148.971974629" Nov 25 15:37:42 crc kubenswrapper[4704]: W1125 15:37:42.726596 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-30121a2418e00cae45020bba57c4ef43528d796e4b6daee4bf3f40a0a0153531 WatchSource:0}: Error finding container 30121a2418e00cae45020bba57c4ef43528d796e4b6daee4bf3f40a0a0153531: Status 404 returned error can't find the container with id 30121a2418e00cae45020bba57c4ef43528d796e4b6daee4bf3f40a0a0153531 Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.780946 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.781570 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.281547673 +0000 UTC m=+149.549821454 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: W1125 15:37:42.836678 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-d6a67b8d4b478788050a4deff720e2c675951c5c2aa428c861097469e31277bb WatchSource:0}: Error finding container d6a67b8d4b478788050a4deff720e2c675951c5c2aa428c861097469e31277bb: Status 404 returned error can't find the container with id d6a67b8d4b478788050a4deff720e2c675951c5c2aa428c861097469e31277bb Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.882124 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.882318 4704 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.382294119 +0000 UTC m=+149.650567900 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.882513 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.882920 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.382912738 +0000 UTC m=+149.651186519 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.984952 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.985177 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.485146799 +0000 UTC m=+149.753420580 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:42 crc kubenswrapper[4704]: I1125 15:37:42.985245 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:42 crc kubenswrapper[4704]: E1125 15:37:42.985569 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.485555762 +0000 UTC m=+149.753829553 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.053703 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.053834 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.086747 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.086936 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.586905447 +0000 UTC m=+149.855179228 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.087057 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.087410 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.587397332 +0000 UTC m=+149.855671113 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.097923 4704 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7q5r8 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.098031 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" podUID="42554b00-c5ca-41d5-b84e-af36e56239c6" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.098185 4704 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7q5r8 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.098271 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" podUID="42554b00-c5ca-41d5-b84e-af36e56239c6" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: 
connection refused" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.187735 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.187998 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.687964192 +0000 UTC m=+149.956237973 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.188073 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.188535 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-25 15:37:43.688520149 +0000 UTC m=+149.956793930 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.288803 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.289029 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.788997277 +0000 UTC m=+150.057271058 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.289308 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.289836 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.789825972 +0000 UTC m=+150.058099763 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.391314 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.391556 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.891501707 +0000 UTC m=+150.159775488 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.391962 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.392369 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.892352753 +0000 UTC m=+150.160626534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.437447 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrgmq" event={"ID":"35b36add-59c3-4cb4-940e-76535d4d7479","Type":"ContainerStarted","Data":"4ed6e99c9318d1fd2a4db43e74787eed7ea89dc280742977703a40a6076d4b0b"} Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.438985 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b82418255bd1e9fd2d444e4a3b181ff516fdd030d5a4317e839fe46831320b93"} Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.439032 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d6a67b8d4b478788050a4deff720e2c675951c5c2aa428c861097469e31277bb"} Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.440060 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ec3d8aa998510ce398fe9c0f9b658569a0ca0af82ebc94781abd6b025120fae6"} Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.442417 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qxl8w" 
event={"ID":"fff4a5ac-b41a-4c64-b448-5a687e16e9cd","Type":"ContainerStarted","Data":"95a3820fdad6b189b1030af96357c1923146e0821be0f52955d8be7c6b6b6bd0"} Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.455446 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"30121a2418e00cae45020bba57c4ef43528d796e4b6daee4bf3f40a0a0153531"} Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.457937 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" event={"ID":"7b416e1d-da7d-4da7-9bae-210c815d4cf1","Type":"ContainerStarted","Data":"b81f003398e1d37f154c2b6f5c16c114cbe2a09593995bd4d59b58969670a295"} Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.460816 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsp9t" event={"ID":"3b78cd0f-f99d-4774-bc16-002fd09387ed","Type":"ContainerStarted","Data":"12be0f76b41fe9f7cabb7ca0c606642f2b83f4a7c035925cf89076e227bda2a5"} Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.467269 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-jrgmq" podStartSLOduration=125.467248406 podStartE2EDuration="2m5.467248406s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:43.466218865 +0000 UTC m=+149.734492646" watchObservedRunningTime="2025-11-25 15:37:43.467248406 +0000 UTC m=+149.735522197" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.472562 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" 
event={"ID":"2b9c93c0-005e-4b54-a498-a4ae8418f839","Type":"ContainerStarted","Data":"2270c9d1361a9b1c63a11a1ca4f0fc35ef770b83ac056ebfdd5ae448cc225407"} Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.477505 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mc6bz" event={"ID":"0841ffab-5cab-4009-8b15-bbab0863a3be","Type":"ContainerStarted","Data":"d9b175315853d9471a2629c7e93ff362f461cd89a419d23e68a516c88fbbb8df"} Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.478142 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-mc6bz" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.478423 4704 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7q5r8 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.478466 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" podUID="42554b00-c5ca-41d5-b84e-af36e56239c6" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.478867 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.485141 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsp9t" podStartSLOduration=125.485121444 podStartE2EDuration="2m5.485121444s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:43.483296368 +0000 UTC m=+149.751570149" watchObservedRunningTime="2025-11-25 15:37:43.485121444 +0000 UTC m=+149.753395225" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.493639 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.493921 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.993895172 +0000 UTC m=+150.262168953 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.494644 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.495426 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:43.995402989 +0000 UTC m=+150.263676770 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.511774 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-fz52t" podStartSLOduration=125.511751259 podStartE2EDuration="2m5.511751259s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:43.508644574 +0000 UTC m=+149.776918375" watchObservedRunningTime="2025-11-25 15:37:43.511751259 +0000 UTC m=+149.780025040" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.535524 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-qxl8w" podStartSLOduration=126.535498517 podStartE2EDuration="2m6.535498517s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:43.534270439 +0000 UTC m=+149.802544220" watchObservedRunningTime="2025-11-25 15:37:43.535498517 +0000 UTC m=+149.803772298" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.565667 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt" podStartSLOduration=125.56564752 podStartE2EDuration="2m5.56564752s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:43.563505935 +0000 UTC m=+149.831779716" watchObservedRunningTime="2025-11-25 15:37:43.56564752 +0000 UTC m=+149.833921301" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.589132 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mc6bz" podStartSLOduration=11.589109439 podStartE2EDuration="11.589109439s" podCreationTimestamp="2025-11-25 15:37:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:43.588656195 +0000 UTC m=+149.856929976" watchObservedRunningTime="2025-11-25 15:37:43.589109439 +0000 UTC m=+149.857383220" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.597483 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.597691 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:44.097662781 +0000 UTC m=+150.365936562 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.597823 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.600953 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:44.100938161 +0000 UTC m=+150.369211942 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.628405 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c7s9c" podStartSLOduration=126.628383092 podStartE2EDuration="2m6.628383092s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:43.627157214 +0000 UTC m=+149.895430995" watchObservedRunningTime="2025-11-25 15:37:43.628383092 +0000 UTC m=+149.896656873" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.648322 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gff7t" podStartSLOduration=126.648298822 podStartE2EDuration="2m6.648298822s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:43.64595081 +0000 UTC m=+149.914224591" watchObservedRunningTime="2025-11-25 15:37:43.648298822 +0000 UTC m=+149.916572603" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.682570 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8df8d" podStartSLOduration=125.682550851 podStartE2EDuration="2m5.682550851s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:43.681552861 +0000 UTC m=+149.949826652" watchObservedRunningTime="2025-11-25 15:37:43.682550851 +0000 UTC m=+149.950824632" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.699230 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.699436 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:44.199407368 +0000 UTC m=+150.467681149 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.699505 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.700094 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:44.200086608 +0000 UTC m=+150.468360389 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.728146 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" podStartSLOduration=125.728123497 podStartE2EDuration="2m5.728123497s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:43.718373279 +0000 UTC m=+149.986647060" watchObservedRunningTime="2025-11-25 15:37:43.728123497 +0000 UTC m=+149.996397318" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.755860 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.755947 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-7pmpw" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.758069 4704 patch_prober.go:28] interesting pod/console-f9d7485db-7pmpw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.758197 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7pmpw" podUID="6e167ba8-a633-42df-963a-913ba4fe20bf" containerName="console" probeResult="failure" 
output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.773885 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tr85d" podStartSLOduration=125.773862098 podStartE2EDuration="2m5.773862098s" podCreationTimestamp="2025-11-25 15:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:43.772317721 +0000 UTC m=+150.040591502" watchObservedRunningTime="2025-11-25 15:37:43.773862098 +0000 UTC m=+150.042135879" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.784078 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.800884 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.801056 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:44.301034271 +0000 UTC m=+150.569308062 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.801632 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.801988 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:44.30197885 +0000 UTC m=+150.570252631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.824828 4704 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-gc5rd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.824900 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" podUID="4ce56dcb-a916-41ca-b706-df5e157576eb" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.869743 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.877628 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.903323 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:43 crc kubenswrapper[4704]: E1125 15:37:43.905272 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:44.405249893 +0000 UTC m=+150.673523674 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.916128 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-gbzgh" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.917199 4704 patch_prober.go:28] interesting pod/downloads-7954f5f757-gbzgh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.917225 4704 patch_prober.go:28] interesting pod/downloads-7954f5f757-gbzgh container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.917242 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.917283 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.917501 4704 patch_prober.go:28] interesting pod/downloads-7954f5f757-gbzgh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Nov 25 15:37:43 crc kubenswrapper[4704]: I1125 15:37:43.917524 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.005198 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:44 crc kubenswrapper[4704]: E1125 15:37:44.005708 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:44.50568876 +0000 UTC m=+150.773962541 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.067254 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:37:44 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld Nov 25 15:37:44 crc kubenswrapper[4704]: [+]process-running ok Nov 25 15:37:44 crc kubenswrapper[4704]: healthz check failed Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.067386 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.106616 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:44 crc kubenswrapper[4704]: E1125 15:37:44.106958 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 15:37:44.606935471 +0000 UTC m=+150.875209252 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.156778 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.157380 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.159054 4704 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-cqkjk container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.29:8443/livez\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.159136 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" podUID="2b9c93c0-005e-4b54-a498-a4ae8418f839" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.29:8443/livez\": dial tcp 10.217.0.29:8443: connect: connection refused" Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.208464 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:44 crc kubenswrapper[4704]: E1125 15:37:44.208952 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:44.708926425 +0000 UTC m=+150.977200376 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.309611 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:44 crc kubenswrapper[4704]: E1125 15:37:44.310188 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:44.810166146 +0000 UTC m=+151.078439927 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.410940 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:44 crc kubenswrapper[4704]: E1125 15:37:44.411355 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:44.911338966 +0000 UTC m=+151.179612747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.491465 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z692d" event={"ID":"5d946af5-4a4e-476e-ad32-3eae6ad6c8f7","Type":"ContainerStarted","Data":"65fa94dac75eacf411155d927745b1fc0282be3d709eefc4caebc8a5592ab421"} Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.497013 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8b9a336a60202f6cde967e4cb77b345127e52695c25cdb81f162fcc3322851ec"} Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.497824 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.500162 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b5f30bf12107bbddfa842030fcfac70a993abda82e7e4a04e97bb00b5ea0e6f4"} Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.506490 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b697g" event={"ID":"66e027ec-5c2f-4314-8572-052d7202f17c","Type":"ContainerStarted","Data":"d41864ec1a4743a8f29c7fc471776069bf720ecf2731b21c76c05343e0b1ea63"} Nov 25 15:37:44 crc 
kubenswrapper[4704]: I1125 15:37:44.512282 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:44 crc kubenswrapper[4704]: E1125 15:37:44.515749 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:45.015729773 +0000 UTC m=+151.284003554 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.615727 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:44 crc kubenswrapper[4704]: E1125 15:37:44.621317 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-25 15:37:45.121302497 +0000 UTC m=+151.389576278 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.716917 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:44 crc kubenswrapper[4704]: E1125 15:37:44.717213 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:45.217181004 +0000 UTC m=+151.485454785 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.717588 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:44 crc kubenswrapper[4704]: E1125 15:37:44.717954 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:45.217938517 +0000 UTC m=+151.486212298 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.818963 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:44 crc kubenswrapper[4704]: E1125 15:37:44.819507 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:45.319477908 +0000 UTC m=+151.587751689 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.819982 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:44 crc kubenswrapper[4704]: E1125 15:37:44.820593 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:45.320576731 +0000 UTC m=+151.588850512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.900320 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-z692d" podStartSLOduration=127.900298563 podStartE2EDuration="2m7.900298563s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:44.899460358 +0000 UTC m=+151.167734139" watchObservedRunningTime="2025-11-25 15:37:44.900298563 +0000 UTC m=+151.168572344" Nov 25 15:37:44 crc kubenswrapper[4704]: I1125 15:37:44.923494 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:44 crc kubenswrapper[4704]: E1125 15:37:44.924289 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:45.424271098 +0000 UTC m=+151.692544879 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.025457 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:45 crc kubenswrapper[4704]: E1125 15:37:45.025818 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:45.525804798 +0000 UTC m=+151.794078579 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.056138 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:37:45 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld Nov 25 15:37:45 crc kubenswrapper[4704]: [+]process-running ok Nov 25 15:37:45 crc kubenswrapper[4704]: healthz check failed Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.056212 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.126966 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:45 crc kubenswrapper[4704]: E1125 15:37:45.128899 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 15:37:45.628881935 +0000 UTC m=+151.897155716 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.187611 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.188449 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.193318 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.193551 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.207367 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.231908 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:45 crc kubenswrapper[4704]: 
E1125 15:37:45.232740 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:45.732726156 +0000 UTC m=+152.000999937 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.327552 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.333451 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.333664 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/473f9f98-2337-4ceb-a91f-6fe3cd2dffc2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"473f9f98-2337-4ceb-a91f-6fe3cd2dffc2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.333696 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/473f9f98-2337-4ceb-a91f-6fe3cd2dffc2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"473f9f98-2337-4ceb-a91f-6fe3cd2dffc2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:37:45 crc kubenswrapper[4704]: E1125 15:37:45.333833 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:45.833816443 +0000 UTC m=+152.102090224 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.340675 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tbhnz" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.349283 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.368335 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-d8lwt" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.434673 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/473f9f98-2337-4ceb-a91f-6fe3cd2dffc2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"473f9f98-2337-4ceb-a91f-6fe3cd2dffc2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.434727 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/473f9f98-2337-4ceb-a91f-6fe3cd2dffc2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"473f9f98-2337-4ceb-a91f-6fe3cd2dffc2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.434766 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:45 crc kubenswrapper[4704]: E1125 15:37:45.435524 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:45.935506528 +0000 UTC m=+152.203780489 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.436136 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/473f9f98-2337-4ceb-a91f-6fe3cd2dffc2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"473f9f98-2337-4ceb-a91f-6fe3cd2dffc2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.464845 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/473f9f98-2337-4ceb-a91f-6fe3cd2dffc2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"473f9f98-2337-4ceb-a91f-6fe3cd2dffc2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.521622 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.536046 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:45 crc kubenswrapper[4704]: E1125 15:37:45.536310 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:46.036289455 +0000 UTC m=+152.304563246 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.563261 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-nd74n" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.614983 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-smlnk"] Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.616027 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.620193 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.637378 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:45 crc kubenswrapper[4704]: E1125 15:37:45.644524 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:46.14450705 +0000 UTC m=+152.412780831 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.642779 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-smlnk"] Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.662684 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-nd74n" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.756467 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.756699 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvc5r\" (UniqueName: \"kubernetes.io/projected/09422116-4570-4f3b-bde3-aaebdb318c47-kube-api-access-jvc5r\") pod \"community-operators-smlnk\" (UID: \"09422116-4570-4f3b-bde3-aaebdb318c47\") " pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.756730 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09422116-4570-4f3b-bde3-aaebdb318c47-catalog-content\") pod \"community-operators-smlnk\" (UID: 
\"09422116-4570-4f3b-bde3-aaebdb318c47\") " pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.756806 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09422116-4570-4f3b-bde3-aaebdb318c47-utilities\") pod \"community-operators-smlnk\" (UID: \"09422116-4570-4f3b-bde3-aaebdb318c47\") " pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:37:45 crc kubenswrapper[4704]: E1125 15:37:45.757280 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:46.257098019 +0000 UTC m=+152.525371830 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.809586 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mx4sl"] Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.828332 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.845226 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.855166 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mx4sl"] Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.859659 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.859729 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09422116-4570-4f3b-bde3-aaebdb318c47-utilities\") pod \"community-operators-smlnk\" (UID: \"09422116-4570-4f3b-bde3-aaebdb318c47\") " pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.859803 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvc5r\" (UniqueName: \"kubernetes.io/projected/09422116-4570-4f3b-bde3-aaebdb318c47-kube-api-access-jvc5r\") pod \"community-operators-smlnk\" (UID: \"09422116-4570-4f3b-bde3-aaebdb318c47\") " pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.859826 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09422116-4570-4f3b-bde3-aaebdb318c47-catalog-content\") pod \"community-operators-smlnk\" (UID: 
\"09422116-4570-4f3b-bde3-aaebdb318c47\") " pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.860453 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09422116-4570-4f3b-bde3-aaebdb318c47-catalog-content\") pod \"community-operators-smlnk\" (UID: \"09422116-4570-4f3b-bde3-aaebdb318c47\") " pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:37:45 crc kubenswrapper[4704]: E1125 15:37:45.860758 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:46.360744584 +0000 UTC m=+152.629018365 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.861168 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09422116-4570-4f3b-bde3-aaebdb318c47-utilities\") pod \"community-operators-smlnk\" (UID: \"09422116-4570-4f3b-bde3-aaebdb318c47\") " pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.907421 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvc5r\" (UniqueName: \"kubernetes.io/projected/09422116-4570-4f3b-bde3-aaebdb318c47-kube-api-access-jvc5r\") pod \"community-operators-smlnk\" (UID: 
\"09422116-4570-4f3b-bde3-aaebdb318c47\") " pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.937412 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.946510 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.965390 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:45 crc kubenswrapper[4704]: E1125 15:37:45.965507 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:46.465489562 +0000 UTC m=+152.733763343 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.966411 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmfxk\" (UniqueName: \"kubernetes.io/projected/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-kube-api-access-wmfxk\") pod \"certified-operators-mx4sl\" (UID: \"260ec6a9-8914-49dc-8cd8-95c8fa30a29a\") " pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.966443 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-utilities\") pod \"certified-operators-mx4sl\" (UID: \"260ec6a9-8914-49dc-8cd8-95c8fa30a29a\") " pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.966480 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.966534 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-catalog-content\") pod \"certified-operators-mx4sl\" (UID: \"260ec6a9-8914-49dc-8cd8-95c8fa30a29a\") " pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:37:45 crc kubenswrapper[4704]: E1125 15:37:45.966952 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:46.466943197 +0000 UTC m=+152.735216978 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:45 crc kubenswrapper[4704]: I1125 15:37:45.988595 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-xqbjw" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.041741 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.058366 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x9rvl"] Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.060258 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-9whr7" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.060580 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.071032 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:37:46 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld Nov 25 15:37:46 crc kubenswrapper[4704]: [+]process-running ok Nov 25 15:37:46 crc kubenswrapper[4704]: healthz check failed Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.071115 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.071806 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.095466 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmfxk\" (UniqueName: \"kubernetes.io/projected/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-kube-api-access-wmfxk\") pod \"certified-operators-mx4sl\" (UID: \"260ec6a9-8914-49dc-8cd8-95c8fa30a29a\") " pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.095537 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-utilities\") pod \"certified-operators-mx4sl\" 
(UID: \"260ec6a9-8914-49dc-8cd8-95c8fa30a29a\") " pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.095667 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-catalog-content\") pod \"certified-operators-mx4sl\" (UID: \"260ec6a9-8914-49dc-8cd8-95c8fa30a29a\") " pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:37:46 crc kubenswrapper[4704]: E1125 15:37:46.095708 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:46.59567775 +0000 UTC m=+152.863951531 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.097011 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-catalog-content\") pod \"certified-operators-mx4sl\" (UID: \"260ec6a9-8914-49dc-8cd8-95c8fa30a29a\") " pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.098312 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-utilities\") pod 
\"certified-operators-mx4sl\" (UID: \"260ec6a9-8914-49dc-8cd8-95c8fa30a29a\") " pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.106820 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x9rvl"] Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.155246 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7q5r8" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.172820 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmfxk\" (UniqueName: \"kubernetes.io/projected/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-kube-api-access-wmfxk\") pod \"certified-operators-mx4sl\" (UID: \"260ec6a9-8914-49dc-8cd8-95c8fa30a29a\") " pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.203168 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vnchd"] Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.203634 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72d532a8-8d23-4796-a9eb-80a3aeb3acae-catalog-content\") pod \"community-operators-x9rvl\" (UID: \"72d532a8-8d23-4796-a9eb-80a3aeb3acae\") " pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.203730 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5gd4\" (UniqueName: \"kubernetes.io/projected/72d532a8-8d23-4796-a9eb-80a3aeb3acae-kube-api-access-l5gd4\") pod \"community-operators-x9rvl\" (UID: \"72d532a8-8d23-4796-a9eb-80a3aeb3acae\") " pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.203871 
4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72d532a8-8d23-4796-a9eb-80a3aeb3acae-utilities\") pod \"community-operators-x9rvl\" (UID: \"72d532a8-8d23-4796-a9eb-80a3aeb3acae\") " pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.203916 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:46 crc kubenswrapper[4704]: E1125 15:37:46.204271 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:46.704252506 +0000 UTC m=+152.972526287 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.204439 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.211722 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.269700 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.273252 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vnchd"] Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.305612 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.306345 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72d532a8-8d23-4796-a9eb-80a3aeb3acae-utilities\") pod \"community-operators-x9rvl\" (UID: \"72d532a8-8d23-4796-a9eb-80a3aeb3acae\") " pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.306408 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e025f26-2a87-46e7-a152-84793272fb4b-catalog-content\") pod \"certified-operators-vnchd\" (UID: \"3e025f26-2a87-46e7-a152-84793272fb4b\") " pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.306435 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72d532a8-8d23-4796-a9eb-80a3aeb3acae-catalog-content\") pod \"community-operators-x9rvl\" (UID: \"72d532a8-8d23-4796-a9eb-80a3aeb3acae\") " 
pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.306453 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e025f26-2a87-46e7-a152-84793272fb4b-utilities\") pod \"certified-operators-vnchd\" (UID: \"3e025f26-2a87-46e7-a152-84793272fb4b\") " pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.306469 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrqdb\" (UniqueName: \"kubernetes.io/projected/3e025f26-2a87-46e7-a152-84793272fb4b-kube-api-access-vrqdb\") pod \"certified-operators-vnchd\" (UID: \"3e025f26-2a87-46e7-a152-84793272fb4b\") " pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.306497 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5gd4\" (UniqueName: \"kubernetes.io/projected/72d532a8-8d23-4796-a9eb-80a3aeb3acae-kube-api-access-l5gd4\") pod \"community-operators-x9rvl\" (UID: \"72d532a8-8d23-4796-a9eb-80a3aeb3acae\") " pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:37:46 crc kubenswrapper[4704]: E1125 15:37:46.307472 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:46.807454807 +0000 UTC m=+153.075728588 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.319122 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72d532a8-8d23-4796-a9eb-80a3aeb3acae-utilities\") pod \"community-operators-x9rvl\" (UID: \"72d532a8-8d23-4796-a9eb-80a3aeb3acae\") " pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.319430 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72d532a8-8d23-4796-a9eb-80a3aeb3acae-catalog-content\") pod \"community-operators-x9rvl\" (UID: \"72d532a8-8d23-4796-a9eb-80a3aeb3acae\") " pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.378416 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5gd4\" (UniqueName: \"kubernetes.io/projected/72d532a8-8d23-4796-a9eb-80a3aeb3acae-kube-api-access-l5gd4\") pod \"community-operators-x9rvl\" (UID: \"72d532a8-8d23-4796-a9eb-80a3aeb3acae\") " pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.419657 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: 
\"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.419715 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e025f26-2a87-46e7-a152-84793272fb4b-catalog-content\") pod \"certified-operators-vnchd\" (UID: \"3e025f26-2a87-46e7-a152-84793272fb4b\") " pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.419741 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e025f26-2a87-46e7-a152-84793272fb4b-utilities\") pod \"certified-operators-vnchd\" (UID: \"3e025f26-2a87-46e7-a152-84793272fb4b\") " pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.419757 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrqdb\" (UniqueName: \"kubernetes.io/projected/3e025f26-2a87-46e7-a152-84793272fb4b-kube-api-access-vrqdb\") pod \"certified-operators-vnchd\" (UID: \"3e025f26-2a87-46e7-a152-84793272fb4b\") " pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.420677 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e025f26-2a87-46e7-a152-84793272fb4b-catalog-content\") pod \"certified-operators-vnchd\" (UID: \"3e025f26-2a87-46e7-a152-84793272fb4b\") " pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.426229 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e025f26-2a87-46e7-a152-84793272fb4b-utilities\") pod \"certified-operators-vnchd\" (UID: 
\"3e025f26-2a87-46e7-a152-84793272fb4b\") " pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:37:46 crc kubenswrapper[4704]: E1125 15:37:46.428757 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:46.928734852 +0000 UTC m=+153.197008633 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.454135 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.461913 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrqdb\" (UniqueName: \"kubernetes.io/projected/3e025f26-2a87-46e7-a152-84793272fb4b-kube-api-access-vrqdb\") pod \"certified-operators-vnchd\" (UID: \"3e025f26-2a87-46e7-a152-84793272fb4b\") " pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.521287 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:46 crc kubenswrapper[4704]: E1125 15:37:46.521572 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:47.021555946 +0000 UTC m=+153.289829727 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.575067 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b697g" event={"ID":"66e027ec-5c2f-4314-8572-052d7202f17c","Type":"ContainerStarted","Data":"3f50038e7163cc735c7b4c233ead8da178e6c0bfe01d63652795035a2389a835"} Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.576190 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.593799 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"473f9f98-2337-4ceb-a91f-6fe3cd2dffc2","Type":"ContainerStarted","Data":"c949be37243e3efaa08e4539af47ba43bb8340ec3f643605af8bd54bb6de571a"} Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.622991 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:46 crc kubenswrapper[4704]: E1125 15:37:46.623392 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-25 15:37:47.123373745 +0000 UTC m=+153.391647516 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.727654 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:46 crc kubenswrapper[4704]: E1125 15:37:46.729994 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:47.22997173 +0000 UTC m=+153.498245511 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.766516 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-smlnk"] Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.829740 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:46 crc kubenswrapper[4704]: E1125 15:37:46.830214 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:47.33019762 +0000 UTC m=+153.598471401 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:46 crc kubenswrapper[4704]: I1125 15:37:46.935608 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:46 crc kubenswrapper[4704]: E1125 15:37:46.936198 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:47.436174986 +0000 UTC m=+153.704448757 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.039372 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:47 crc kubenswrapper[4704]: E1125 15:37:47.039969 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:47.539952535 +0000 UTC m=+153.808226316 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.066437 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:37:47 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld Nov 25 15:37:47 crc kubenswrapper[4704]: [+]process-running ok Nov 25 15:37:47 crc kubenswrapper[4704]: healthz check failed Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.066506 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.140392 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:47 crc kubenswrapper[4704]: E1125 15:37:47.141634 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 15:37:47.641605758 +0000 UTC m=+153.909879549 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.159297 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x9rvl"] Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.248469 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:47 crc kubenswrapper[4704]: E1125 15:37:47.248869 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:47.748854653 +0000 UTC m=+154.017128424 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.300259 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mx4sl"] Nov 25 15:37:47 crc kubenswrapper[4704]: W1125 15:37:47.314041 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod260ec6a9_8914_49dc_8cd8_95c8fa30a29a.slice/crio-7dd3c26fef1b6ab2f864fa116cb35c419bb3632be889a464cc8eda23909f673c WatchSource:0}: Error finding container 7dd3c26fef1b6ab2f864fa116cb35c419bb3632be889a464cc8eda23909f673c: Status 404 returned error can't find the container with id 7dd3c26fef1b6ab2f864fa116cb35c419bb3632be889a464cc8eda23909f673c Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.349353 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:47 crc kubenswrapper[4704]: E1125 15:37:47.349904 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:47.849886618 +0000 UTC m=+154.118160399 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.451115 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:47 crc kubenswrapper[4704]: E1125 15:37:47.451932 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:47.951905123 +0000 UTC m=+154.220178904 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.553556 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:47 crc kubenswrapper[4704]: E1125 15:37:47.554635 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:48.054603989 +0000 UTC m=+154.322877760 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.563310 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vnchd"] Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.589182 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mffhk"] Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.595375 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.605082 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.625501 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mffhk"] Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.645018 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b697g" event={"ID":"66e027ec-5c2f-4314-8572-052d7202f17c","Type":"ContainerStarted","Data":"3d784e3ab062ee14e05c8dfdd766a33d26e1485c9953d8eb7a24bfe44d72773e"} Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.657466 4704 generic.go:334] "Generic (PLEG): container finished" podID="09422116-4570-4f3b-bde3-aaebdb318c47" containerID="4c3a582ab9bc65b171524f7888e926516ff7564dc738bc8dc33bf67258c8231d" exitCode=0 Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.657564 4704 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smlnk" event={"ID":"09422116-4570-4f3b-bde3-aaebdb318c47","Type":"ContainerDied","Data":"4c3a582ab9bc65b171524f7888e926516ff7564dc738bc8dc33bf67258c8231d"} Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.657592 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smlnk" event={"ID":"09422116-4570-4f3b-bde3-aaebdb318c47","Type":"ContainerStarted","Data":"66ba46f0bc70edc5a7c1f05cbc3a488ab73892eddf91b9da1c221bb5221bbc6a"} Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.659383 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:47 crc kubenswrapper[4704]: E1125 15:37:47.659708 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:48.159695088 +0000 UTC m=+154.427968869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.661338 4704 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.678482 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnchd" event={"ID":"3e025f26-2a87-46e7-a152-84793272fb4b","Type":"ContainerStarted","Data":"ac0aaaf195cc5cbbd51e5bb429c20e680f844dd33939a2a66958b62af85faf15"} Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.690994 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mx4sl" event={"ID":"260ec6a9-8914-49dc-8cd8-95c8fa30a29a","Type":"ContainerStarted","Data":"7dd3c26fef1b6ab2f864fa116cb35c419bb3632be889a464cc8eda23909f673c"} Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.696855 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"473f9f98-2337-4ceb-a91f-6fe3cd2dffc2","Type":"ContainerStarted","Data":"796444c0e8ea5562fd60a81527c17acff6c4ead769eedbbcb23e2ec7141b1faa"} Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.707778 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9rvl" event={"ID":"72d532a8-8d23-4796-a9eb-80a3aeb3acae","Type":"ContainerStarted","Data":"68704e5ca5830cab0bb46c6e10d7f6dcc452beaa03cac001b12d02a8d99dd051"} Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.707840 4704 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9rvl" event={"ID":"72d532a8-8d23-4796-a9eb-80a3aeb3acae","Type":"ContainerStarted","Data":"97100f37f45484a09a4acee208358ae0dde49f4b1b520193a3d0f01e1160d860"} Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.721315 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.721291275 podStartE2EDuration="2.721291275s" podCreationTimestamp="2025-11-25 15:37:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:47.719462079 +0000 UTC m=+153.987735860" watchObservedRunningTime="2025-11-25 15:37:47.721291275 +0000 UTC m=+153.989565046" Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.761248 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.761515 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h6r6\" (UniqueName: \"kubernetes.io/projected/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-kube-api-access-5h6r6\") pod \"redhat-marketplace-mffhk\" (UID: \"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe\") " pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.761556 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-utilities\") pod \"redhat-marketplace-mffhk\" (UID: 
\"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe\") " pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.761650 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-catalog-content\") pod \"redhat-marketplace-mffhk\" (UID: \"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe\") " pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:37:47 crc kubenswrapper[4704]: E1125 15:37:47.761882 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:48.261859588 +0000 UTC m=+154.530133369 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.863270 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-catalog-content\") pod \"redhat-marketplace-mffhk\" (UID: \"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe\") " pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.863328 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.863378 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h6r6\" (UniqueName: \"kubernetes.io/projected/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-kube-api-access-5h6r6\") pod \"redhat-marketplace-mffhk\" (UID: \"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe\") " pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.863401 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-utilities\") pod \"redhat-marketplace-mffhk\" (UID: \"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe\") " pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.863922 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-catalog-content\") pod \"redhat-marketplace-mffhk\" (UID: \"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe\") " pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.864011 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-utilities\") pod \"redhat-marketplace-mffhk\" (UID: \"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe\") " pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:37:47 crc kubenswrapper[4704]: E1125 15:37:47.864309 4704 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:48.364289225 +0000 UTC m=+154.632563016 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.888120 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h6r6\" (UniqueName: \"kubernetes.io/projected/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-kube-api-access-5h6r6\") pod \"redhat-marketplace-mffhk\" (UID: \"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe\") " pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.931393 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.964677 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:47 crc kubenswrapper[4704]: E1125 15:37:47.964908 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 15:37:48.464874606 +0000 UTC m=+154.733148397 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.965378 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:47 crc kubenswrapper[4704]: E1125 15:37:47.965812 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:48.465781904 +0000 UTC m=+154.734055865 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.969971 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gzmll"] Nov 25 15:37:47 crc kubenswrapper[4704]: I1125 15:37:47.971415 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.000607 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzmll"] Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.058745 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:37:48 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld Nov 25 15:37:48 crc kubenswrapper[4704]: [+]process-running ok Nov 25 15:37:48 crc kubenswrapper[4704]: healthz check failed Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.058847 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.066769 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.067135 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nks6c\" (UniqueName: \"kubernetes.io/projected/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-kube-api-access-nks6c\") pod \"redhat-marketplace-gzmll\" (UID: \"5448c847-73a1-4fdc-8d52-d0b0f4ea5129\") " pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.067163 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-catalog-content\") pod \"redhat-marketplace-gzmll\" (UID: \"5448c847-73a1-4fdc-8d52-d0b0f4ea5129\") " pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.067200 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-utilities\") pod \"redhat-marketplace-gzmll\" (UID: \"5448c847-73a1-4fdc-8d52-d0b0f4ea5129\") " pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:37:48 crc kubenswrapper[4704]: E1125 15:37:48.067328 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:48.567313474 +0000 UTC m=+154.835587255 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.125374 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.126314 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.130309 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.130379 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.134782 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.168815 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nks6c\" (UniqueName: \"kubernetes.io/projected/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-kube-api-access-nks6c\") pod \"redhat-marketplace-gzmll\" (UID: \"5448c847-73a1-4fdc-8d52-d0b0f4ea5129\") " pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.168870 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-catalog-content\") pod \"redhat-marketplace-gzmll\" (UID: \"5448c847-73a1-4fdc-8d52-d0b0f4ea5129\") " pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.168927 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-utilities\") pod \"redhat-marketplace-gzmll\" (UID: \"5448c847-73a1-4fdc-8d52-d0b0f4ea5129\") " pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.168976 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:48 crc kubenswrapper[4704]: E1125 15:37:48.169331 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:48.669315299 +0000 UTC m=+154.937589080 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.170266 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-catalog-content\") pod \"redhat-marketplace-gzmll\" (UID: \"5448c847-73a1-4fdc-8d52-d0b0f4ea5129\") " pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.170547 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-utilities\") pod \"redhat-marketplace-gzmll\" (UID: \"5448c847-73a1-4fdc-8d52-d0b0f4ea5129\") " pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.189660 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nks6c\" (UniqueName: \"kubernetes.io/projected/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-kube-api-access-nks6c\") pod \"redhat-marketplace-gzmll\" (UID: \"5448c847-73a1-4fdc-8d52-d0b0f4ea5129\") " pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.222623 4704 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.270172 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:48 crc kubenswrapper[4704]: E1125 15:37:48.270710 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:48.770681564 +0000 UTC m=+155.038955345 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.270859 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e206ba36-f94d-46da-af87-c89ea875f4c5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e206ba36-f94d-46da-af87-c89ea875f4c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.270901 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:48 crc 
kubenswrapper[4704]: I1125 15:37:48.270966 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e206ba36-f94d-46da-af87-c89ea875f4c5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e206ba36-f94d-46da-af87-c89ea875f4c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:37:48 crc kubenswrapper[4704]: E1125 15:37:48.271302 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:48.771289822 +0000 UTC m=+155.039563603 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.297858 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.378604 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.378868 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e206ba36-f94d-46da-af87-c89ea875f4c5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e206ba36-f94d-46da-af87-c89ea875f4c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.378948 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e206ba36-f94d-46da-af87-c89ea875f4c5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e206ba36-f94d-46da-af87-c89ea875f4c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:37:48 crc kubenswrapper[4704]: E1125 15:37:48.379047 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:48.879020932 +0000 UTC m=+155.147294713 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.379120 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.379316 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e206ba36-f94d-46da-af87-c89ea875f4c5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e206ba36-f94d-46da-af87-c89ea875f4c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:37:48 crc kubenswrapper[4704]: E1125 15:37:48.379513 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:48.879504417 +0000 UTC m=+155.147778198 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.382450 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mffhk"] Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.411587 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e206ba36-f94d-46da-af87-c89ea875f4c5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e206ba36-f94d-46da-af87-c89ea875f4c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.445186 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.481776 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:48 crc kubenswrapper[4704]: E1125 15:37:48.481914 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:48.981895844 +0000 UTC m=+155.250169625 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.482086 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:48 crc kubenswrapper[4704]: E1125 15:37:48.482443 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:48.98243284 +0000 UTC m=+155.250706621 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.584653 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:48 crc kubenswrapper[4704]: E1125 15:37:48.584933 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 15:37:49.084910389 +0000 UTC m=+155.353184180 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.585393 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:48 crc kubenswrapper[4704]: E1125 15:37:48.585818 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 15:37:49.085806087 +0000 UTC m=+155.354079868 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qb7gf" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.595896 4704 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-25T15:37:48.222646722Z","Handler":null,"Name":""} Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.602942 4704 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.602983 4704 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.671699 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzmll"] Nov 25 15:37:48 crc kubenswrapper[4704]: W1125 15:37:48.687454 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5448c847_73a1_4fdc_8d52_d0b0f4ea5129.slice/crio-d58334644cb8089ee185198f8eaedeaee32b42db214f48db698edbd6ac3783bd WatchSource:0}: Error finding container d58334644cb8089ee185198f8eaedeaee32b42db214f48db698edbd6ac3783bd: Status 404 returned error can't find the container with id d58334644cb8089ee185198f8eaedeaee32b42db214f48db698edbd6ac3783bd Nov 25 15:37:48 crc kubenswrapper[4704]: 
I1125 15:37:48.687605 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.694985 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.737476 4704 generic.go:334] "Generic (PLEG): container finished" podID="473f9f98-2337-4ceb-a91f-6fe3cd2dffc2" containerID="796444c0e8ea5562fd60a81527c17acff6c4ead769eedbbcb23e2ec7141b1faa" exitCode=0 Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.737593 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"473f9f98-2337-4ceb-a91f-6fe3cd2dffc2","Type":"ContainerDied","Data":"796444c0e8ea5562fd60a81527c17acff6c4ead769eedbbcb23e2ec7141b1faa"} Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.778663 4704 generic.go:334] "Generic (PLEG): container finished" podID="72d532a8-8d23-4796-a9eb-80a3aeb3acae" containerID="68704e5ca5830cab0bb46c6e10d7f6dcc452beaa03cac001b12d02a8d99dd051" exitCode=0 Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.778757 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9rvl" 
event={"ID":"72d532a8-8d23-4796-a9eb-80a3aeb3acae","Type":"ContainerDied","Data":"68704e5ca5830cab0bb46c6e10d7f6dcc452beaa03cac001b12d02a8d99dd051"} Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.785841 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-94qft"] Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.787123 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.788995 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzmll" event={"ID":"5448c847-73a1-4fdc-8d52-d0b0f4ea5129","Type":"ContainerStarted","Data":"d58334644cb8089ee185198f8eaedeaee32b42db214f48db698edbd6ac3783bd"} Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.789466 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.790474 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.800695 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b697g" event={"ID":"66e027ec-5c2f-4314-8572-052d7202f17c","Type":"ContainerStarted","Data":"a8254df0f1a56d3c92016715ebf42149a93b16c7dc3127d0944a886c5d163b4b"} Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.800758 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-94qft"] Nov 25 15:37:48 crc 
kubenswrapper[4704]: I1125 15:37:48.804151 4704 generic.go:334] "Generic (PLEG): container finished" podID="3e025f26-2a87-46e7-a152-84793272fb4b" containerID="09f6279cf74281b27542a52cee0aae733699067167e89815c8b289249ce1d2fc" exitCode=0 Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.804230 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnchd" event={"ID":"3e025f26-2a87-46e7-a152-84793272fb4b","Type":"ContainerDied","Data":"09f6279cf74281b27542a52cee0aae733699067167e89815c8b289249ce1d2fc"} Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.812726 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mffhk" event={"ID":"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe","Type":"ContainerStarted","Data":"ceb9cd6a14d8e2df6c05f3a6ec775f76f19f65bcef90319f556dc7ee8638e9e9"} Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.816536 4704 generic.go:334] "Generic (PLEG): container finished" podID="260ec6a9-8914-49dc-8cd8-95c8fa30a29a" containerID="70beef7ddbd2aea21db1fe5fbf4694740e9a17afb3a13de4efb169ed39636fa7" exitCode=0 Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.816593 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mx4sl" event={"ID":"260ec6a9-8914-49dc-8cd8-95c8fa30a29a","Type":"ContainerDied","Data":"70beef7ddbd2aea21db1fe5fbf4694740e9a17afb3a13de4efb169ed39636fa7"} Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.833765 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.837971 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.838103 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.859859 4704 patch_prober.go:28] interesting pod/apiserver-76f77b778f-z692d container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 25 15:37:48 crc kubenswrapper[4704]: [+]log ok Nov 25 15:37:48 crc kubenswrapper[4704]: [+]etcd ok Nov 25 15:37:48 crc kubenswrapper[4704]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 25 15:37:48 crc kubenswrapper[4704]: [+]poststarthook/generic-apiserver-start-informers ok Nov 25 15:37:48 crc kubenswrapper[4704]: [+]poststarthook/max-in-flight-filter ok Nov 25 15:37:48 crc kubenswrapper[4704]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 25 15:37:48 crc kubenswrapper[4704]: [+]poststarthook/image.openshift.io-apiserver-caches ok Nov 25 15:37:48 crc kubenswrapper[4704]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Nov 25 15:37:48 crc kubenswrapper[4704]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Nov 25 15:37:48 crc kubenswrapper[4704]: [+]poststarthook/project.openshift.io-projectcache ok Nov 25 15:37:48 crc kubenswrapper[4704]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Nov 25 15:37:48 crc kubenswrapper[4704]: [+]poststarthook/openshift.io-startinformers ok Nov 25 15:37:48 crc kubenswrapper[4704]: [+]poststarthook/openshift.io-restmapperupdater ok Nov 25 15:37:48 crc kubenswrapper[4704]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 25 15:37:48 crc kubenswrapper[4704]: livez check failed Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.859919 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-z692d" podUID="5d946af5-4a4e-476e-ad32-3eae6ad6c8f7" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.874950 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-b697g" podStartSLOduration=16.874923133 podStartE2EDuration="16.874923133s" podCreationTimestamp="2025-11-25 15:37:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:48.850493235 +0000 UTC m=+155.118767016" watchObservedRunningTime="2025-11-25 15:37:48.874923133 +0000 UTC m=+155.143196914" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.891740 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgrvk\" (UniqueName: \"kubernetes.io/projected/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-kube-api-access-jgrvk\") pod \"redhat-operators-94qft\" (UID: \"cdb3b0c5-af6c-4d36-bba3-5a8419a72107\") " pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.891877 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-utilities\") pod \"redhat-operators-94qft\" (UID: \"cdb3b0c5-af6c-4d36-bba3-5a8419a72107\") " pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.891905 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-catalog-content\") pod \"redhat-operators-94qft\" (UID: \"cdb3b0c5-af6c-4d36-bba3-5a8419a72107\") " pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.995413 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-utilities\") pod \"redhat-operators-94qft\" (UID: \"cdb3b0c5-af6c-4d36-bba3-5a8419a72107\") " pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.996499 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-catalog-content\") pod \"redhat-operators-94qft\" (UID: \"cdb3b0c5-af6c-4d36-bba3-5a8419a72107\") " pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.996499 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-utilities\") pod \"redhat-operators-94qft\" (UID: \"cdb3b0c5-af6c-4d36-bba3-5a8419a72107\") " pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.996689 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgrvk\" (UniqueName: \"kubernetes.io/projected/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-kube-api-access-jgrvk\") pod \"redhat-operators-94qft\" (UID: \"cdb3b0c5-af6c-4d36-bba3-5a8419a72107\") " pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:37:48 crc kubenswrapper[4704]: I1125 15:37:48.996902 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-catalog-content\") pod \"redhat-operators-94qft\" (UID: \"cdb3b0c5-af6c-4d36-bba3-5a8419a72107\") " pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.019670 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgrvk\" (UniqueName: 
\"kubernetes.io/projected/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-kube-api-access-jgrvk\") pod \"redhat-operators-94qft\" (UID: \"cdb3b0c5-af6c-4d36-bba3-5a8419a72107\") " pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.056782 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:37:49 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld Nov 25 15:37:49 crc kubenswrapper[4704]: [+]process-running ok Nov 25 15:37:49 crc kubenswrapper[4704]: healthz check failed Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.056917 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.117231 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.168259 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d57hr"] Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.169750 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.177387 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.197869 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/babe0f08-98f5-4fde-827a-148857d14ebe-utilities\") pod \"redhat-operators-d57hr\" (UID: \"babe0f08-98f5-4fde-827a-148857d14ebe\") " pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.197903 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vntgh\" (UniqueName: \"kubernetes.io/projected/babe0f08-98f5-4fde-827a-148857d14ebe-kube-api-access-vntgh\") pod \"redhat-operators-d57hr\" (UID: \"babe0f08-98f5-4fde-827a-148857d14ebe\") " pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.197943 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/babe0f08-98f5-4fde-827a-148857d14ebe-catalog-content\") pod \"redhat-operators-d57hr\" (UID: \"babe0f08-98f5-4fde-827a-148857d14ebe\") " pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.199548 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d57hr"] Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.208336 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cqkjk" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.290292 4704 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.290800 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.298871 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/babe0f08-98f5-4fde-827a-148857d14ebe-catalog-content\") pod \"redhat-operators-d57hr\" (UID: \"babe0f08-98f5-4fde-827a-148857d14ebe\") " pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.299000 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/babe0f08-98f5-4fde-827a-148857d14ebe-utilities\") pod \"redhat-operators-d57hr\" (UID: \"babe0f08-98f5-4fde-827a-148857d14ebe\") " pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.299020 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vntgh\" (UniqueName: \"kubernetes.io/projected/babe0f08-98f5-4fde-827a-148857d14ebe-kube-api-access-vntgh\") pod \"redhat-operators-d57hr\" (UID: \"babe0f08-98f5-4fde-827a-148857d14ebe\") " pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.300118 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/babe0f08-98f5-4fde-827a-148857d14ebe-catalog-content\") pod \"redhat-operators-d57hr\" (UID: \"babe0f08-98f5-4fde-827a-148857d14ebe\") " pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.300344 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/babe0f08-98f5-4fde-827a-148857d14ebe-utilities\") pod \"redhat-operators-d57hr\" (UID: \"babe0f08-98f5-4fde-827a-148857d14ebe\") " pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.332348 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vntgh\" (UniqueName: \"kubernetes.io/projected/babe0f08-98f5-4fde-827a-148857d14ebe-kube-api-access-vntgh\") pod \"redhat-operators-d57hr\" (UID: \"babe0f08-98f5-4fde-827a-148857d14ebe\") " pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.418526 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-94qft"] Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.487058 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.633754 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qb7gf\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") " pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.687392 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.725767 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d57hr"] Nov 25 15:37:49 crc kubenswrapper[4704]: W1125 15:37:49.796295 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbabe0f08_98f5_4fde_827a_148857d14ebe.slice/crio-25fa03f23a85fc99ba71f7def1d9c90c39980c91b554a9b1a25340dbe125f1d5 WatchSource:0}: Error finding container 25fa03f23a85fc99ba71f7def1d9c90c39980c91b554a9b1a25340dbe125f1d5: Status 404 returned error can't find the container with id 25fa03f23a85fc99ba71f7def1d9c90c39980c91b554a9b1a25340dbe125f1d5 Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.823832 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mffhk" event={"ID":"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe","Type":"ContainerDied","Data":"adb952b1edb22107b9d420a4dc1bd85e8c68d180c10c50a6deed2ee7490a95d7"} Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.823784 4704 generic.go:334] "Generic (PLEG): container finished" podID="36f52ec1-c7de-4345-bc4a-4d8fc6f182fe" containerID="adb952b1edb22107b9d420a4dc1bd85e8c68d180c10c50a6deed2ee7490a95d7" exitCode=0 Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.830853 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94qft" event={"ID":"cdb3b0c5-af6c-4d36-bba3-5a8419a72107","Type":"ContainerStarted","Data":"25c53006bdf190422531f0dcc52b3e6310ea1e6788d085287fba4ab8708750a0"} Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.831916 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"e206ba36-f94d-46da-af87-c89ea875f4c5","Type":"ContainerStarted","Data":"7dbdfaa946b5977898e5e52d610ec003150c483847261d618d017e92a4e30f2c"} Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.831942 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e206ba36-f94d-46da-af87-c89ea875f4c5","Type":"ContainerStarted","Data":"7ca95949d9c2cfcf4ff0d9a97b23d5fa81fc50721c102cba063b68d3bad10c00"} Nov 25 15:37:49 crc kubenswrapper[4704]: I1125 15:37:49.839954 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d57hr" event={"ID":"babe0f08-98f5-4fde-827a-148857d14ebe","Type":"ContainerStarted","Data":"25fa03f23a85fc99ba71f7def1d9c90c39980c91b554a9b1a25340dbe125f1d5"} Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.051992 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qb7gf"] Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.062072 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:37:50 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld Nov 25 15:37:50 crc kubenswrapper[4704]: [+]process-running ok Nov 25 15:37:50 crc kubenswrapper[4704]: healthz check failed Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.062131 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:37:50 crc kubenswrapper[4704]: W1125 15:37:50.128216 4704 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97aea51f_7b9e_44f2_a310_8a27cc66f8d9.slice/crio-4b1eb8a36a5a83a2440695304a99e241e1bae86ffce2fd66da5c1489a342c49d WatchSource:0}: Error finding container 4b1eb8a36a5a83a2440695304a99e241e1bae86ffce2fd66da5c1489a342c49d: Status 404 returned error can't find the container with id 4b1eb8a36a5a83a2440695304a99e241e1bae86ffce2fd66da5c1489a342c49d Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.132135 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.314483 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/473f9f98-2337-4ceb-a91f-6fe3cd2dffc2-kube-api-access\") pod \"473f9f98-2337-4ceb-a91f-6fe3cd2dffc2\" (UID: \"473f9f98-2337-4ceb-a91f-6fe3cd2dffc2\") " Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.314563 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/473f9f98-2337-4ceb-a91f-6fe3cd2dffc2-kubelet-dir\") pod \"473f9f98-2337-4ceb-a91f-6fe3cd2dffc2\" (UID: \"473f9f98-2337-4ceb-a91f-6fe3cd2dffc2\") " Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.314731 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/473f9f98-2337-4ceb-a91f-6fe3cd2dffc2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "473f9f98-2337-4ceb-a91f-6fe3cd2dffc2" (UID: "473f9f98-2337-4ceb-a91f-6fe3cd2dffc2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.322267 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473f9f98-2337-4ceb-a91f-6fe3cd2dffc2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "473f9f98-2337-4ceb-a91f-6fe3cd2dffc2" (UID: "473f9f98-2337-4ceb-a91f-6fe3cd2dffc2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.416231 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/473f9f98-2337-4ceb-a91f-6fe3cd2dffc2-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.416274 4704 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/473f9f98-2337-4ceb-a91f-6fe3cd2dffc2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.427521 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.549236 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.849248 4704 generic.go:334] "Generic (PLEG): container finished" podID="cdb3b0c5-af6c-4d36-bba3-5a8419a72107" containerID="1602d8c38f0b068631e8b40e66c5d572f5ad77fd1fa9014d89f26422eac0bec1" exitCode=0 Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.849341 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94qft" 
event={"ID":"cdb3b0c5-af6c-4d36-bba3-5a8419a72107","Type":"ContainerDied","Data":"1602d8c38f0b068631e8b40e66c5d572f5ad77fd1fa9014d89f26422eac0bec1"} Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.853483 4704 generic.go:334] "Generic (PLEG): container finished" podID="5448c847-73a1-4fdc-8d52-d0b0f4ea5129" containerID="d8986572938f73df99c95558ffc0e1b9eb720c7ea80f731b2825b691d5c53b50" exitCode=0 Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.853560 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzmll" event={"ID":"5448c847-73a1-4fdc-8d52-d0b0f4ea5129","Type":"ContainerDied","Data":"d8986572938f73df99c95558ffc0e1b9eb720c7ea80f731b2825b691d5c53b50"} Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.858036 4704 generic.go:334] "Generic (PLEG): container finished" podID="e206ba36-f94d-46da-af87-c89ea875f4c5" containerID="7dbdfaa946b5977898e5e52d610ec003150c483847261d618d017e92a4e30f2c" exitCode=0 Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.858174 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e206ba36-f94d-46da-af87-c89ea875f4c5","Type":"ContainerDied","Data":"7dbdfaa946b5977898e5e52d610ec003150c483847261d618d017e92a4e30f2c"} Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.860811 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d57hr" event={"ID":"babe0f08-98f5-4fde-827a-148857d14ebe","Type":"ContainerDied","Data":"f341ce98d4e9ecacadbce59d4c73895573b4da41c4a2048a581125d4b78ade3f"} Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.861097 4704 generic.go:334] "Generic (PLEG): container finished" podID="babe0f08-98f5-4fde-827a-148857d14ebe" containerID="f341ce98d4e9ecacadbce59d4c73895573b4da41c4a2048a581125d4b78ade3f" exitCode=0 Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.862958 4704 generic.go:334] "Generic (PLEG): container finished" 
podID="38b9ead9-033f-44cb-9657-6a078bed2c0d" containerID="535b6f8581345683bf90ded26a97ff2e554d4df43f8222991d8cb64e5d622052" exitCode=0 Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.863097 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t" event={"ID":"38b9ead9-033f-44cb-9657-6a078bed2c0d","Type":"ContainerDied","Data":"535b6f8581345683bf90ded26a97ff2e554d4df43f8222991d8cb64e5d622052"} Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.864943 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" event={"ID":"97aea51f-7b9e-44f2-a310-8a27cc66f8d9","Type":"ContainerStarted","Data":"3efdca317811d9aa436d3ad8123fef082412fb166be72f895a197a34bf42d286"} Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.864983 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" event={"ID":"97aea51f-7b9e-44f2-a310-8a27cc66f8d9","Type":"ContainerStarted","Data":"4b1eb8a36a5a83a2440695304a99e241e1bae86ffce2fd66da5c1489a342c49d"} Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.865124 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.866703 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"473f9f98-2337-4ceb-a91f-6fe3cd2dffc2","Type":"ContainerDied","Data":"c949be37243e3efaa08e4539af47ba43bb8340ec3f643605af8bd54bb6de571a"} Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.866751 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c949be37243e3efaa08e4539af47ba43bb8340ec3f643605af8bd54bb6de571a" Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.866800 4704 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 15:37:50 crc kubenswrapper[4704]: I1125 15:37:50.906136 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" podStartSLOduration=133.906114011 podStartE2EDuration="2m13.906114011s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:37:50.903172891 +0000 UTC m=+157.171446672" watchObservedRunningTime="2025-11-25 15:37:50.906114011 +0000 UTC m=+157.174387792" Nov 25 15:37:51 crc kubenswrapper[4704]: I1125 15:37:51.060033 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:37:51 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld Nov 25 15:37:51 crc kubenswrapper[4704]: [+]process-running ok Nov 25 15:37:51 crc kubenswrapper[4704]: healthz check failed Nov 25 15:37:51 crc kubenswrapper[4704]: I1125 15:37:51.060129 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:37:51 crc kubenswrapper[4704]: I1125 15:37:51.065951 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mc6bz" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.054835 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Nov 25 15:37:52 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld Nov 25 15:37:52 crc kubenswrapper[4704]: [+]process-running ok Nov 25 15:37:52 crc kubenswrapper[4704]: healthz check failed Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.055213 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.104721 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.198573 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.241710 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e206ba36-f94d-46da-af87-c89ea875f4c5-kube-api-access\") pod \"e206ba36-f94d-46da-af87-c89ea875f4c5\" (UID: \"e206ba36-f94d-46da-af87-c89ea875f4c5\") " Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.241846 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e206ba36-f94d-46da-af87-c89ea875f4c5-kubelet-dir\") pod \"e206ba36-f94d-46da-af87-c89ea875f4c5\" (UID: \"e206ba36-f94d-46da-af87-c89ea875f4c5\") " Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.242184 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e206ba36-f94d-46da-af87-c89ea875f4c5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e206ba36-f94d-46da-af87-c89ea875f4c5" (UID: "e206ba36-f94d-46da-af87-c89ea875f4c5"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.248415 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e206ba36-f94d-46da-af87-c89ea875f4c5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e206ba36-f94d-46da-af87-c89ea875f4c5" (UID: "e206ba36-f94d-46da-af87-c89ea875f4c5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.343084 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38b9ead9-033f-44cb-9657-6a078bed2c0d-config-volume\") pod \"38b9ead9-033f-44cb-9657-6a078bed2c0d\" (UID: \"38b9ead9-033f-44cb-9657-6a078bed2c0d\") " Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.343211 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38b9ead9-033f-44cb-9657-6a078bed2c0d-secret-volume\") pod \"38b9ead9-033f-44cb-9657-6a078bed2c0d\" (UID: \"38b9ead9-033f-44cb-9657-6a078bed2c0d\") " Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.343264 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t992b\" (UniqueName: \"kubernetes.io/projected/38b9ead9-033f-44cb-9657-6a078bed2c0d-kube-api-access-t992b\") pod \"38b9ead9-033f-44cb-9657-6a078bed2c0d\" (UID: \"38b9ead9-033f-44cb-9657-6a078bed2c0d\") " Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.343524 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e206ba36-f94d-46da-af87-c89ea875f4c5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.343540 4704 reconciler_common.go:293] "Volume detached for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e206ba36-f94d-46da-af87-c89ea875f4c5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.344311 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b9ead9-033f-44cb-9657-6a078bed2c0d-config-volume" (OuterVolumeSpecName: "config-volume") pod "38b9ead9-033f-44cb-9657-6a078bed2c0d" (UID: "38b9ead9-033f-44cb-9657-6a078bed2c0d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.347076 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b9ead9-033f-44cb-9657-6a078bed2c0d-kube-api-access-t992b" (OuterVolumeSpecName: "kube-api-access-t992b") pod "38b9ead9-033f-44cb-9657-6a078bed2c0d" (UID: "38b9ead9-033f-44cb-9657-6a078bed2c0d"). InnerVolumeSpecName "kube-api-access-t992b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.349890 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38b9ead9-033f-44cb-9657-6a078bed2c0d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "38b9ead9-033f-44cb-9657-6a078bed2c0d" (UID: "38b9ead9-033f-44cb-9657-6a078bed2c0d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.445299 4704 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38b9ead9-033f-44cb-9657-6a078bed2c0d-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.445334 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t992b\" (UniqueName: \"kubernetes.io/projected/38b9ead9-033f-44cb-9657-6a078bed2c0d-kube-api-access-t992b\") on node \"crc\" DevicePath \"\"" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.445346 4704 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38b9ead9-033f-44cb-9657-6a078bed2c0d-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.878543 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e206ba36-f94d-46da-af87-c89ea875f4c5","Type":"ContainerDied","Data":"7ca95949d9c2cfcf4ff0d9a97b23d5fa81fc50721c102cba063b68d3bad10c00"} Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.878592 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ca95949d9c2cfcf4ff0d9a97b23d5fa81fc50721c102cba063b68d3bad10c00" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.878588 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.880882 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t" event={"ID":"38b9ead9-033f-44cb-9657-6a078bed2c0d","Type":"ContainerDied","Data":"b971bccd687ebc803beba0b3d32c8b57b1fbf488f2b50765c372355d5971615a"} Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.880904 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b971bccd687ebc803beba0b3d32c8b57b1fbf488f2b50765c372355d5971615a" Nov 25 15:37:52 crc kubenswrapper[4704]: I1125 15:37:52.880936 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401410-8g42t" Nov 25 15:37:53 crc kubenswrapper[4704]: I1125 15:37:53.055775 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:37:53 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld Nov 25 15:37:53 crc kubenswrapper[4704]: [+]process-running ok Nov 25 15:37:53 crc kubenswrapper[4704]: healthz check failed Nov 25 15:37:53 crc kubenswrapper[4704]: I1125 15:37:53.055959 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:37:53 crc kubenswrapper[4704]: I1125 15:37:53.768676 4704 patch_prober.go:28] interesting pod/console-f9d7485db-7pmpw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" 
start-of-body= Nov 25 15:37:53 crc kubenswrapper[4704]: I1125 15:37:53.768753 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7pmpw" podUID="6e167ba8-a633-42df-963a-913ba4fe20bf" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Nov 25 15:37:53 crc kubenswrapper[4704]: I1125 15:37:53.828989 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:37:53 crc kubenswrapper[4704]: I1125 15:37:53.843725 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:53 crc kubenswrapper[4704]: I1125 15:37:53.852863 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-z692d" Nov 25 15:37:53 crc kubenswrapper[4704]: I1125 15:37:53.916903 4704 patch_prober.go:28] interesting pod/downloads-7954f5f757-gbzgh container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Nov 25 15:37:53 crc kubenswrapper[4704]: I1125 15:37:53.916938 4704 patch_prober.go:28] interesting pod/downloads-7954f5f757-gbzgh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Nov 25 15:37:53 crc kubenswrapper[4704]: I1125 15:37:53.916979 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Nov 25 
15:37:53 crc kubenswrapper[4704]: I1125 15:37:53.916994 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Nov 25 15:37:54 crc kubenswrapper[4704]: I1125 15:37:54.055687 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:37:54 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld Nov 25 15:37:54 crc kubenswrapper[4704]: [+]process-running ok Nov 25 15:37:54 crc kubenswrapper[4704]: healthz check failed Nov 25 15:37:54 crc kubenswrapper[4704]: I1125 15:37:54.055758 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 15:37:55 crc kubenswrapper[4704]: I1125 15:37:55.054321 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 15:37:55 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld Nov 25 15:37:55 crc kubenswrapper[4704]: [+]process-running ok Nov 25 15:37:55 crc kubenswrapper[4704]: healthz check failed Nov 25 15:37:55 crc kubenswrapper[4704]: I1125 15:37:55.054390 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed 
with statuscode: 500"
Nov 25 15:37:56 crc kubenswrapper[4704]: I1125 15:37:56.054877 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 25 15:37:56 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld
Nov 25 15:37:56 crc kubenswrapper[4704]: [+]process-running ok
Nov 25 15:37:56 crc kubenswrapper[4704]: healthz check failed
Nov 25 15:37:56 crc kubenswrapper[4704]: I1125 15:37:56.054989 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 25 15:37:57 crc kubenswrapper[4704]: I1125 15:37:57.054654 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 25 15:37:57 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld
Nov 25 15:37:57 crc kubenswrapper[4704]: [+]process-running ok
Nov 25 15:37:57 crc kubenswrapper[4704]: healthz check failed
Nov 25 15:37:57 crc kubenswrapper[4704]: I1125 15:37:57.054752 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 25 15:37:58 crc kubenswrapper[4704]: I1125 15:37:58.054477 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 25 15:37:58 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld
Nov 25 15:37:58 crc kubenswrapper[4704]: [+]process-running ok
Nov 25 15:37:58 crc kubenswrapper[4704]: healthz check failed
Nov 25 15:37:58 crc kubenswrapper[4704]: I1125 15:37:58.054557 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 25 15:37:59 crc kubenswrapper[4704]: I1125 15:37:59.054449 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 25 15:37:59 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld
Nov 25 15:37:59 crc kubenswrapper[4704]: [+]process-running ok
Nov 25 15:37:59 crc kubenswrapper[4704]: healthz check failed
Nov 25 15:37:59 crc kubenswrapper[4704]: I1125 15:37:59.054555 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 25 15:38:00 crc kubenswrapper[4704]: I1125 15:38:00.055462 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 25 15:38:00 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld
Nov 25 15:38:00 crc kubenswrapper[4704]: [+]process-running ok
Nov 25 15:38:00 crc kubenswrapper[4704]: healthz check failed
Nov 25 15:38:00 crc kubenswrapper[4704]: I1125 15:38:00.055996 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 25 15:38:00 crc kubenswrapper[4704]: I1125 15:38:00.298397 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs\") pod \"network-metrics-daemon-z6lnx\" (UID: \"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\") " pod="openshift-multus/network-metrics-daemon-z6lnx"
Nov 25 15:38:00 crc kubenswrapper[4704]: I1125 15:38:00.316865 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9cf8fad-2f72-4a94-958b-dd58fc76f4df-metrics-certs\") pod \"network-metrics-daemon-z6lnx\" (UID: \"b9cf8fad-2f72-4a94-958b-dd58fc76f4df\") " pod="openshift-multus/network-metrics-daemon-z6lnx"
Nov 25 15:38:00 crc kubenswrapper[4704]: I1125 15:38:00.449602 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z6lnx"
Nov 25 15:38:01 crc kubenswrapper[4704]: I1125 15:38:01.055096 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 25 15:38:01 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld
Nov 25 15:38:01 crc kubenswrapper[4704]: [+]process-running ok
Nov 25 15:38:01 crc kubenswrapper[4704]: healthz check failed
Nov 25 15:38:01 crc kubenswrapper[4704]: I1125 15:38:01.055183 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 25 15:38:02 crc kubenswrapper[4704]: I1125 15:38:02.056024 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 25 15:38:02 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld
Nov 25 15:38:02 crc kubenswrapper[4704]: [+]process-running ok
Nov 25 15:38:02 crc kubenswrapper[4704]: healthz check failed
Nov 25 15:38:02 crc kubenswrapper[4704]: I1125 15:38:02.056091 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 25 15:38:03 crc kubenswrapper[4704]: I1125 15:38:03.055654 4704 patch_prober.go:28] interesting pod/router-default-5444994796-9whr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 25 15:38:03 crc kubenswrapper[4704]: [-]has-synced failed: reason withheld
Nov 25 15:38:03 crc kubenswrapper[4704]: [+]process-running ok
Nov 25 15:38:03 crc kubenswrapper[4704]: healthz check failed
Nov 25 15:38:03 crc kubenswrapper[4704]: I1125 15:38:03.055739 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9whr7" podUID="2c7e3eb0-06b1-4391-9685-713da13f5bd1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 25 15:38:03 crc kubenswrapper[4704]: I1125 15:38:03.755650 4704 patch_prober.go:28] interesting pod/console-f9d7485db-7pmpw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Nov 25 15:38:03 crc kubenswrapper[4704]: I1125 15:38:03.755722 4704 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7pmpw" podUID="6e167ba8-a633-42df-963a-913ba4fe20bf" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused"
Nov 25 15:38:03 crc kubenswrapper[4704]: I1125 15:38:03.917517 4704 patch_prober.go:28] interesting pod/downloads-7954f5f757-gbzgh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Nov 25 15:38:03 crc kubenswrapper[4704]: I1125 15:38:03.917540 4704 patch_prober.go:28] interesting pod/downloads-7954f5f757-gbzgh container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Nov 25 15:38:03 crc kubenswrapper[4704]: I1125 15:38:03.917579 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Nov 25 15:38:03 crc kubenswrapper[4704]: I1125 15:38:03.917642 4704 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-gbzgh"
Nov 25 15:38:03 crc kubenswrapper[4704]: I1125 15:38:03.917579 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Nov 25 15:38:03 crc kubenswrapper[4704]: I1125 15:38:03.918193 4704 patch_prober.go:28] interesting pod/downloads-7954f5f757-gbzgh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Nov 25 15:38:03 crc kubenswrapper[4704]: I1125 15:38:03.918220 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Nov 25 15:38:03 crc kubenswrapper[4704]: I1125 15:38:03.918235 4704 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"fe4b0240cec4d669c84a95a962a07de62164469f517807bf2ab19861e67b2575"} pod="openshift-console/downloads-7954f5f757-gbzgh" containerMessage="Container download-server failed liveness probe, will be restarted"
Nov 25 15:38:03 crc kubenswrapper[4704]: I1125 15:38:03.918308 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" containerID="cri-o://fe4b0240cec4d669c84a95a962a07de62164469f517807bf2ab19861e67b2575" gracePeriod=2
Nov 25 15:38:04 crc kubenswrapper[4704]: I1125 15:38:04.179252 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-9whr7"
Nov 25 15:38:04 crc kubenswrapper[4704]: I1125 15:38:04.185234 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-9whr7"
Nov 25 15:38:05 crc kubenswrapper[4704]: I1125 15:38:05.997284 4704 generic.go:334] "Generic (PLEG): container finished" podID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerID="fe4b0240cec4d669c84a95a962a07de62164469f517807bf2ab19861e67b2575" exitCode=0
Nov 25 15:38:05 crc kubenswrapper[4704]: I1125 15:38:05.997411 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gbzgh" event={"ID":"fe8e9530-3977-4dc5-abe0-f8c655b58f6a","Type":"ContainerDied","Data":"fe4b0240cec4d669c84a95a962a07de62164469f517807bf2ab19861e67b2575"}
Nov 25 15:38:07 crc kubenswrapper[4704]: I1125 15:38:07.965215 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 15:38:07 crc kubenswrapper[4704]: I1125 15:38:07.966000 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 15:38:09 crc kubenswrapper[4704]: I1125 15:38:09.694464 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf"
Nov 25 15:38:13 crc kubenswrapper[4704]: I1125 15:38:13.759382 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-7pmpw"
Nov 25 15:38:13 crc kubenswrapper[4704]: I1125 15:38:13.763152 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-7pmpw"
Nov 25 15:38:13 crc kubenswrapper[4704]: I1125 15:38:13.916892 4704 patch_prober.go:28] interesting pod/downloads-7954f5f757-gbzgh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Nov 25 15:38:13 crc kubenswrapper[4704]: I1125 15:38:13.916960 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Nov 25 15:38:15 crc kubenswrapper[4704]: I1125 15:38:15.647874 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mm4pt"
Nov 25 15:38:22 crc kubenswrapper[4704]: I1125 15:38:22.550469 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 15:38:23 crc kubenswrapper[4704]: I1125 15:38:23.917284 4704 patch_prober.go:28] interesting pod/downloads-7954f5f757-gbzgh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Nov 25 15:38:23 crc kubenswrapper[4704]: I1125 15:38:23.917370 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Nov 25 15:38:25 crc kubenswrapper[4704]: E1125 15:38:25.486424 4704 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Nov 25 15:38:25 crc kubenswrapper[4704]: E1125 15:38:25.487125 4704 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wmfxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-mx4sl_openshift-marketplace(260ec6a9-8914-49dc-8cd8-95c8fa30a29a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 25 15:38:25 crc kubenswrapper[4704]: E1125 15:38:25.488617 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-mx4sl" podUID="260ec6a9-8914-49dc-8cd8-95c8fa30a29a"
Nov 25 15:38:28 crc kubenswrapper[4704]: E1125 15:38:28.868756 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-mx4sl" podUID="260ec6a9-8914-49dc-8cd8-95c8fa30a29a"
Nov 25 15:38:33 crc kubenswrapper[4704]: I1125 15:38:33.916568 4704 patch_prober.go:28] interesting pod/downloads-7954f5f757-gbzgh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Nov 25 15:38:33 crc kubenswrapper[4704]: I1125 15:38:33.917088 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Nov 25 15:38:34 crc kubenswrapper[4704]: E1125 15:38:34.012157 4704 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Nov 25 15:38:34 crc kubenswrapper[4704]: E1125 15:38:34.012349 4704 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vntgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-d57hr_openshift-marketplace(babe0f08-98f5-4fde-827a-148857d14ebe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 25 15:38:34 crc kubenswrapper[4704]: E1125 15:38:34.013447 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-d57hr" podUID="babe0f08-98f5-4fde-827a-148857d14ebe"
Nov 25 15:38:35 crc kubenswrapper[4704]: E1125 15:38:35.577837 4704 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Nov 25 15:38:35 crc kubenswrapper[4704]: E1125 15:38:35.578022 4704 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nks6c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-gzmll_openshift-marketplace(5448c847-73a1-4fdc-8d52-d0b0f4ea5129): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 25 15:38:35 crc kubenswrapper[4704]: E1125 15:38:35.580714 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-gzmll" podUID="5448c847-73a1-4fdc-8d52-d0b0f4ea5129"
Nov 25 15:38:37 crc kubenswrapper[4704]: I1125 15:38:37.964510 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 15:38:37 crc kubenswrapper[4704]: I1125 15:38:37.965119 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 15:38:38 crc kubenswrapper[4704]: E1125 15:38:38.238944 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-gzmll" podUID="5448c847-73a1-4fdc-8d52-d0b0f4ea5129"
Nov 25 15:38:38 crc kubenswrapper[4704]: E1125 15:38:38.238984 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-d57hr" podUID="babe0f08-98f5-4fde-827a-148857d14ebe"
Nov 25 15:38:38 crc kubenswrapper[4704]: E1125 15:38:38.547530 4704 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Nov 25 15:38:38 crc kubenswrapper[4704]: E1125 15:38:38.548345 4704 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l5gd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-x9rvl_openshift-marketplace(72d532a8-8d23-4796-a9eb-80a3aeb3acae): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 25 15:38:38 crc kubenswrapper[4704]: E1125 15:38:38.549584 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-x9rvl" podUID="72d532a8-8d23-4796-a9eb-80a3aeb3acae"
Nov 25 15:38:38 crc kubenswrapper[4704]: I1125 15:38:38.659662 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z6lnx"]
Nov 25 15:38:38 crc kubenswrapper[4704]: W1125 15:38:38.663326 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9cf8fad_2f72_4a94_958b_dd58fc76f4df.slice/crio-f11f39d289a677e6c67f70b14793e0d9087f6816c3101fe0a361ac4d0a7e443f WatchSource:0}: Error finding container f11f39d289a677e6c67f70b14793e0d9087f6816c3101fe0a361ac4d0a7e443f: Status 404 returned error can't find the container with id f11f39d289a677e6c67f70b14793e0d9087f6816c3101fe0a361ac4d0a7e443f
Nov 25 15:38:39 crc kubenswrapper[4704]: I1125 15:38:39.170812 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gbzgh" event={"ID":"fe8e9530-3977-4dc5-abe0-f8c655b58f6a","Type":"ContainerStarted","Data":"943d4bada4f6d6ec3a2d0cd2d631e7b6ea86baf723b507c0693d8f6b6ae27926"}
Nov 25 15:38:39 crc kubenswrapper[4704]: I1125 15:38:39.172078 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-gbzgh"
Nov 25 15:38:39 crc kubenswrapper[4704]: I1125 15:38:39.172695 4704 patch_prober.go:28] interesting pod/downloads-7954f5f757-gbzgh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Nov 25 15:38:39 crc kubenswrapper[4704]: I1125 15:38:39.172761 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Nov 25 15:38:39 crc kubenswrapper[4704]: I1125 15:38:39.173660 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" event={"ID":"b9cf8fad-2f72-4a94-958b-dd58fc76f4df","Type":"ContainerStarted","Data":"f11f39d289a677e6c67f70b14793e0d9087f6816c3101fe0a361ac4d0a7e443f"}
Nov 25 15:38:39 crc kubenswrapper[4704]: E1125 15:38:39.175102 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-x9rvl" podUID="72d532a8-8d23-4796-a9eb-80a3aeb3acae"
Nov 25 15:38:39 crc kubenswrapper[4704]: E1125 15:38:39.468342 4704 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Nov 25 15:38:39 crc kubenswrapper[4704]: E1125 15:38:39.468531 4704 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jvc5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-smlnk_openshift-marketplace(09422116-4570-4f3b-bde3-aaebdb318c47): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 25 15:38:39 crc kubenswrapper[4704]: E1125 15:38:39.469718 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-smlnk" podUID="09422116-4570-4f3b-bde3-aaebdb318c47"
Nov 25 15:38:40 crc kubenswrapper[4704]: I1125 15:38:40.182461 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" event={"ID":"b9cf8fad-2f72-4a94-958b-dd58fc76f4df","Type":"ContainerStarted","Data":"d429e4c7ff09bc7b99c7cd7640ae8fe68709621e330a0e776485ce5951d10c75"}
Nov 25 15:38:40 crc kubenswrapper[4704]: I1125 15:38:40.184974 4704 patch_prober.go:28] interesting pod/downloads-7954f5f757-gbzgh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Nov 25 15:38:40 crc kubenswrapper[4704]: I1125 15:38:40.185023 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Nov 25 15:38:40 crc kubenswrapper[4704]: E1125 15:38:40.185033 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-smlnk" podUID="09422116-4570-4f3b-bde3-aaebdb318c47"
Nov 25 15:38:40 crc kubenswrapper[4704]: E1125 15:38:40.741760 4704 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Nov 25 15:38:40 crc kubenswrapper[4704]: E1125 15:38:40.741994 4704 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5h6r6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mffhk_openshift-marketplace(36f52ec1-c7de-4345-bc4a-4d8fc6f182fe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 25 15:38:40 crc kubenswrapper[4704]: E1125 15:38:40.743238 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mffhk" podUID="36f52ec1-c7de-4345-bc4a-4d8fc6f182fe"
Nov 25 15:38:41 crc kubenswrapper[4704]: I1125 15:38:41.188833 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z6lnx" event={"ID":"b9cf8fad-2f72-4a94-958b-dd58fc76f4df","Type":"ContainerStarted","Data":"b1a9e85e7df4176f4f7180ea2f3d5ad56ddf3d564f574640670357f9dd95190e"}
Nov 25 15:38:41 crc kubenswrapper[4704]: I1125 15:38:41.189447 4704 patch_prober.go:28] interesting pod/downloads-7954f5f757-gbzgh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Nov 25 15:38:41 crc kubenswrapper[4704]: I1125 15:38:41.189489 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Nov 25 15:38:41 crc kubenswrapper[4704]: I1125 15:38:41.206702 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-z6lnx" podStartSLOduration=184.206683882 podStartE2EDuration="3m4.206683882s" podCreationTimestamp="2025-11-25 15:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:38:41.203146314 +0000 UTC m=+207.471420095" watchObservedRunningTime="2025-11-25 15:38:41.206683882 +0000 UTC m=+207.474957663"
Nov 25 15:38:41 crc kubenswrapper[4704]: E1125 15:38:41.981960 4704 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 25 15:38:41 crc kubenswrapper[4704]: E1125 15:38:41.982137 4704 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jgrvk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-94qft_openshift-marketplace(cdb3b0c5-af6c-4d36-bba3-5a8419a72107): ErrImagePull: rpc error: code = Canceled desc = copying system 
image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 15:38:41 crc kubenswrapper[4704]: E1125 15:38:41.983347 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-94qft" podUID="cdb3b0c5-af6c-4d36-bba3-5a8419a72107" Nov 25 15:38:42 crc kubenswrapper[4704]: E1125 15:38:42.197096 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-94qft" podUID="cdb3b0c5-af6c-4d36-bba3-5a8419a72107" Nov 25 15:38:43 crc kubenswrapper[4704]: I1125 15:38:43.917161 4704 patch_prober.go:28] interesting pod/downloads-7954f5f757-gbzgh container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Nov 25 15:38:43 crc kubenswrapper[4704]: I1125 15:38:43.917872 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Nov 25 15:38:43 crc kubenswrapper[4704]: I1125 15:38:43.917196 4704 patch_prober.go:28] interesting pod/downloads-7954f5f757-gbzgh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Nov 25 15:38:43 crc kubenswrapper[4704]: I1125 15:38:43.917973 4704 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gbzgh" podUID="fe8e9530-3977-4dc5-abe0-f8c655b58f6a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Nov 25 15:38:52 crc kubenswrapper[4704]: I1125 15:38:52.253772 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnchd" event={"ID":"3e025f26-2a87-46e7-a152-84793272fb4b","Type":"ContainerStarted","Data":"df31f8b0bb2d12431fd798a0c506daf51186942f6fa160b68a7eaed1bc0921ae"} Nov 25 15:38:52 crc kubenswrapper[4704]: I1125 15:38:52.255806 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mx4sl" event={"ID":"260ec6a9-8914-49dc-8cd8-95c8fa30a29a","Type":"ContainerStarted","Data":"7e3cdf7f960d79183ec1e8e72702ba7dc4d18419e8edfa59a5a5842172fcec54"} Nov 25 15:38:53 crc kubenswrapper[4704]: I1125 15:38:53.930424 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-gbzgh" Nov 25 15:38:56 crc kubenswrapper[4704]: E1125 15:38:56.074756 4704 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e025f26_2a87_46e7_a152_84793272fb4b.slice/crio-df31f8b0bb2d12431fd798a0c506daf51186942f6fa160b68a7eaed1bc0921ae.scope\": RecentStats: unable to find data in memory cache]" Nov 25 15:38:57 crc kubenswrapper[4704]: I1125 15:38:57.282049 4704 generic.go:334] "Generic (PLEG): container finished" podID="3e025f26-2a87-46e7-a152-84793272fb4b" containerID="df31f8b0bb2d12431fd798a0c506daf51186942f6fa160b68a7eaed1bc0921ae" exitCode=0 Nov 25 15:38:57 crc kubenswrapper[4704]: I1125 15:38:57.282161 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnchd" 
event={"ID":"3e025f26-2a87-46e7-a152-84793272fb4b","Type":"ContainerDied","Data":"df31f8b0bb2d12431fd798a0c506daf51186942f6fa160b68a7eaed1bc0921ae"} Nov 25 15:38:57 crc kubenswrapper[4704]: I1125 15:38:57.283691 4704 generic.go:334] "Generic (PLEG): container finished" podID="260ec6a9-8914-49dc-8cd8-95c8fa30a29a" containerID="7e3cdf7f960d79183ec1e8e72702ba7dc4d18419e8edfa59a5a5842172fcec54" exitCode=0 Nov 25 15:38:57 crc kubenswrapper[4704]: I1125 15:38:57.283725 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mx4sl" event={"ID":"260ec6a9-8914-49dc-8cd8-95c8fa30a29a","Type":"ContainerDied","Data":"7e3cdf7f960d79183ec1e8e72702ba7dc4d18419e8edfa59a5a5842172fcec54"} Nov 25 15:39:07 crc kubenswrapper[4704]: I1125 15:39:07.964412 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:39:07 crc kubenswrapper[4704]: I1125 15:39:07.965324 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:39:07 crc kubenswrapper[4704]: I1125 15:39:07.965375 4704 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:39:07 crc kubenswrapper[4704]: I1125 15:39:07.966111 4704 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7929ba83619d7931b1d300c06bf9091d9e711ae6a14dad101005c5b42d07f675"} 
pod="openshift-machine-config-operator/machine-config-daemon-djz8x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 15:39:07 crc kubenswrapper[4704]: I1125 15:39:07.966172 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" containerID="cri-o://7929ba83619d7931b1d300c06bf9091d9e711ae6a14dad101005c5b42d07f675" gracePeriod=600 Nov 25 15:39:08 crc kubenswrapper[4704]: I1125 15:39:08.358753 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94qft" event={"ID":"cdb3b0c5-af6c-4d36-bba3-5a8419a72107","Type":"ContainerStarted","Data":"5ea26e846b666fdbd0d1da572245ad3e4b5709758d8a44776cd67eb606d3c8a8"} Nov 25 15:39:08 crc kubenswrapper[4704]: I1125 15:39:08.361066 4704 generic.go:334] "Generic (PLEG): container finished" podID="72d532a8-8d23-4796-a9eb-80a3aeb3acae" containerID="69a0ecd507d2ab069ac526d6a97b606482aeea78a5d1c7853489fa39042c5f8c" exitCode=0 Nov 25 15:39:08 crc kubenswrapper[4704]: I1125 15:39:08.361118 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9rvl" event={"ID":"72d532a8-8d23-4796-a9eb-80a3aeb3acae","Type":"ContainerDied","Data":"69a0ecd507d2ab069ac526d6a97b606482aeea78a5d1c7853489fa39042c5f8c"} Nov 25 15:39:08 crc kubenswrapper[4704]: I1125 15:39:08.364057 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d57hr" event={"ID":"babe0f08-98f5-4fde-827a-148857d14ebe","Type":"ContainerStarted","Data":"598c71074243848618353669294d7ff850808fdebaac98b749b8d7090caaa4c3"} Nov 25 15:39:08 crc kubenswrapper[4704]: I1125 15:39:08.366118 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smlnk" 
event={"ID":"09422116-4570-4f3b-bde3-aaebdb318c47","Type":"ContainerStarted","Data":"0f21fda7cd26867522c00030115638a01c7af6e7788125674036939b5bd0318b"} Nov 25 15:39:08 crc kubenswrapper[4704]: I1125 15:39:08.368295 4704 generic.go:334] "Generic (PLEG): container finished" podID="36f52ec1-c7de-4345-bc4a-4d8fc6f182fe" containerID="31f1678ff3eb9f0d68fbd3dd2ebc80d4aca53e6bb0a84252a68d482228d77fca" exitCode=0 Nov 25 15:39:08 crc kubenswrapper[4704]: I1125 15:39:08.368334 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mffhk" event={"ID":"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe","Type":"ContainerDied","Data":"31f1678ff3eb9f0d68fbd3dd2ebc80d4aca53e6bb0a84252a68d482228d77fca"} Nov 25 15:39:08 crc kubenswrapper[4704]: I1125 15:39:08.375199 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mx4sl" event={"ID":"260ec6a9-8914-49dc-8cd8-95c8fa30a29a","Type":"ContainerStarted","Data":"c0c1b8a4b3ec9058a4f9588721d8dd59ee55306bd1432a95d00bc52e270a5e2b"} Nov 25 15:39:08 crc kubenswrapper[4704]: I1125 15:39:08.382854 4704 generic.go:334] "Generic (PLEG): container finished" podID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerID="7929ba83619d7931b1d300c06bf9091d9e711ae6a14dad101005c5b42d07f675" exitCode=0 Nov 25 15:39:08 crc kubenswrapper[4704]: I1125 15:39:08.382921 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerDied","Data":"7929ba83619d7931b1d300c06bf9091d9e711ae6a14dad101005c5b42d07f675"} Nov 25 15:39:08 crc kubenswrapper[4704]: I1125 15:39:08.382951 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerStarted","Data":"bf646e20b03b3390aa256db5e03bee5d833cba5b9a37144d98eae89a8816d8d1"} Nov 25 15:39:08 
crc kubenswrapper[4704]: I1125 15:39:08.393487 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnchd" event={"ID":"3e025f26-2a87-46e7-a152-84793272fb4b","Type":"ContainerStarted","Data":"f8988a351b10566bc8fcaeba5c427e12f28b28e5a4802d89eafdc57fb6f86f57"} Nov 25 15:39:08 crc kubenswrapper[4704]: I1125 15:39:08.399257 4704 generic.go:334] "Generic (PLEG): container finished" podID="5448c847-73a1-4fdc-8d52-d0b0f4ea5129" containerID="5c4d31000afa52a372375f02fcede828c3bdf9308eccb312ed8599f35d42f098" exitCode=0 Nov 25 15:39:08 crc kubenswrapper[4704]: I1125 15:39:08.399315 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzmll" event={"ID":"5448c847-73a1-4fdc-8d52-d0b0f4ea5129","Type":"ContainerDied","Data":"5c4d31000afa52a372375f02fcede828c3bdf9308eccb312ed8599f35d42f098"} Nov 25 15:39:08 crc kubenswrapper[4704]: I1125 15:39:08.475495 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vnchd" podStartSLOduration=3.692100345 podStartE2EDuration="1m22.475475341s" podCreationTimestamp="2025-11-25 15:37:46 +0000 UTC" firstStartedPulling="2025-11-25 15:37:48.806044683 +0000 UTC m=+155.074318464" lastFinishedPulling="2025-11-25 15:39:07.589419679 +0000 UTC m=+233.857693460" observedRunningTime="2025-11-25 15:39:08.47281768 +0000 UTC m=+234.741091461" watchObservedRunningTime="2025-11-25 15:39:08.475475341 +0000 UTC m=+234.743749122" Nov 25 15:39:08 crc kubenswrapper[4704]: I1125 15:39:08.515230 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mx4sl" podStartSLOduration=4.659763774 podStartE2EDuration="1m23.515206638s" podCreationTimestamp="2025-11-25 15:37:45 +0000 UTC" firstStartedPulling="2025-11-25 15:37:48.818153034 +0000 UTC m=+155.086426805" lastFinishedPulling="2025-11-25 15:39:07.673595888 +0000 UTC m=+233.941869669" 
observedRunningTime="2025-11-25 15:39:08.511126703 +0000 UTC m=+234.779400484" watchObservedRunningTime="2025-11-25 15:39:08.515206638 +0000 UTC m=+234.783480419" Nov 25 15:39:09 crc kubenswrapper[4704]: I1125 15:39:09.408265 4704 generic.go:334] "Generic (PLEG): container finished" podID="babe0f08-98f5-4fde-827a-148857d14ebe" containerID="598c71074243848618353669294d7ff850808fdebaac98b749b8d7090caaa4c3" exitCode=0 Nov 25 15:39:09 crc kubenswrapper[4704]: I1125 15:39:09.408375 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d57hr" event={"ID":"babe0f08-98f5-4fde-827a-148857d14ebe","Type":"ContainerDied","Data":"598c71074243848618353669294d7ff850808fdebaac98b749b8d7090caaa4c3"} Nov 25 15:39:09 crc kubenswrapper[4704]: I1125 15:39:09.413872 4704 generic.go:334] "Generic (PLEG): container finished" podID="09422116-4570-4f3b-bde3-aaebdb318c47" containerID="0f21fda7cd26867522c00030115638a01c7af6e7788125674036939b5bd0318b" exitCode=0 Nov 25 15:39:09 crc kubenswrapper[4704]: I1125 15:39:09.413953 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smlnk" event={"ID":"09422116-4570-4f3b-bde3-aaebdb318c47","Type":"ContainerDied","Data":"0f21fda7cd26867522c00030115638a01c7af6e7788125674036939b5bd0318b"} Nov 25 15:39:09 crc kubenswrapper[4704]: I1125 15:39:09.415988 4704 generic.go:334] "Generic (PLEG): container finished" podID="cdb3b0c5-af6c-4d36-bba3-5a8419a72107" containerID="5ea26e846b666fdbd0d1da572245ad3e4b5709758d8a44776cd67eb606d3c8a8" exitCode=0 Nov 25 15:39:09 crc kubenswrapper[4704]: I1125 15:39:09.416027 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94qft" event={"ID":"cdb3b0c5-af6c-4d36-bba3-5a8419a72107","Type":"ContainerDied","Data":"5ea26e846b666fdbd0d1da572245ad3e4b5709758d8a44776cd67eb606d3c8a8"} Nov 25 15:39:10 crc kubenswrapper[4704]: I1125 15:39:10.424529 4704 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-d57hr" event={"ID":"babe0f08-98f5-4fde-827a-148857d14ebe","Type":"ContainerStarted","Data":"ba09015ba8c7d8a74a422af49cd1b9b24702456343aa7130512a223e529f593a"} Nov 25 15:39:10 crc kubenswrapper[4704]: I1125 15:39:10.425932 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smlnk" event={"ID":"09422116-4570-4f3b-bde3-aaebdb318c47","Type":"ContainerStarted","Data":"a29ffc387d11e823ecaa7ff86ac49234cae3877c97949819c9b7fb1e9fb03c08"} Nov 25 15:39:10 crc kubenswrapper[4704]: I1125 15:39:10.429272 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mffhk" event={"ID":"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe","Type":"ContainerStarted","Data":"ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864"} Nov 25 15:39:10 crc kubenswrapper[4704]: I1125 15:39:10.432225 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzmll" event={"ID":"5448c847-73a1-4fdc-8d52-d0b0f4ea5129","Type":"ContainerStarted","Data":"99a5c5d6302287fc9aa215cdbf1c2ca86fb5a656d2493e049f1c4cadfd816ad1"} Nov 25 15:39:10 crc kubenswrapper[4704]: I1125 15:39:10.434193 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9rvl" event={"ID":"72d532a8-8d23-4796-a9eb-80a3aeb3acae","Type":"ContainerStarted","Data":"f4552af765c1af0f3f5edc7215788e8556670a1b7730b4f502ce4f48dbd1a7d2"} Nov 25 15:39:10 crc kubenswrapper[4704]: I1125 15:39:10.453748 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d57hr" podStartSLOduration=2.186635925 podStartE2EDuration="1m21.453721108s" podCreationTimestamp="2025-11-25 15:37:49 +0000 UTC" firstStartedPulling="2025-11-25 15:37:50.862363631 +0000 UTC m=+157.130637402" lastFinishedPulling="2025-11-25 15:39:10.129448814 +0000 UTC m=+236.397722585" 
observedRunningTime="2025-11-25 15:39:10.449584651 +0000 UTC m=+236.717858432" watchObservedRunningTime="2025-11-25 15:39:10.453721108 +0000 UTC m=+236.721994889" Nov 25 15:39:10 crc kubenswrapper[4704]: I1125 15:39:10.469610 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gzmll" podStartSLOduration=4.473791501 podStartE2EDuration="1m23.469588114s" podCreationTimestamp="2025-11-25 15:37:47 +0000 UTC" firstStartedPulling="2025-11-25 15:37:50.856014717 +0000 UTC m=+157.124288498" lastFinishedPulling="2025-11-25 15:39:09.85181132 +0000 UTC m=+236.120085111" observedRunningTime="2025-11-25 15:39:10.46752646 +0000 UTC m=+236.735800241" watchObservedRunningTime="2025-11-25 15:39:10.469588114 +0000 UTC m=+236.737861895" Nov 25 15:39:10 crc kubenswrapper[4704]: I1125 15:39:10.524235 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x9rvl" podStartSLOduration=3.534110334 podStartE2EDuration="1m25.524210517s" podCreationTimestamp="2025-11-25 15:37:45 +0000 UTC" firstStartedPulling="2025-11-25 15:37:47.718638674 +0000 UTC m=+153.986912455" lastFinishedPulling="2025-11-25 15:39:09.708738857 +0000 UTC m=+235.977012638" observedRunningTime="2025-11-25 15:39:10.493013521 +0000 UTC m=+236.761287312" watchObservedRunningTime="2025-11-25 15:39:10.524210517 +0000 UTC m=+236.792484298" Nov 25 15:39:10 crc kubenswrapper[4704]: I1125 15:39:10.526413 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mffhk" podStartSLOduration=4.671004991 podStartE2EDuration="1m23.526407164s" podCreationTimestamp="2025-11-25 15:37:47 +0000 UTC" firstStartedPulling="2025-11-25 15:37:50.870665965 +0000 UTC m=+157.138939746" lastFinishedPulling="2025-11-25 15:39:09.726068138 +0000 UTC m=+235.994341919" observedRunningTime="2025-11-25 15:39:10.523924728 +0000 UTC m=+236.792198519" 
watchObservedRunningTime="2025-11-25 15:39:10.526407164 +0000 UTC m=+236.794680945" Nov 25 15:39:10 crc kubenswrapper[4704]: I1125 15:39:10.552565 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-smlnk" podStartSLOduration=3.048137628 podStartE2EDuration="1m25.552538075s" podCreationTimestamp="2025-11-25 15:37:45 +0000 UTC" firstStartedPulling="2025-11-25 15:37:47.660984868 +0000 UTC m=+153.929258649" lastFinishedPulling="2025-11-25 15:39:10.165385315 +0000 UTC m=+236.433659096" observedRunningTime="2025-11-25 15:39:10.551211934 +0000 UTC m=+236.819485715" watchObservedRunningTime="2025-11-25 15:39:10.552538075 +0000 UTC m=+236.820811876" Nov 25 15:39:11 crc kubenswrapper[4704]: I1125 15:39:11.452448 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94qft" event={"ID":"cdb3b0c5-af6c-4d36-bba3-5a8419a72107","Type":"ContainerStarted","Data":"778b173d28098e0b96b3b2c5e7567794f5752393c66df47e0144e1c225f5f314"} Nov 25 15:39:11 crc kubenswrapper[4704]: I1125 15:39:11.481805 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-94qft" podStartSLOduration=4.003577199 podStartE2EDuration="1m23.481774669s" podCreationTimestamp="2025-11-25 15:37:48 +0000 UTC" firstStartedPulling="2025-11-25 15:37:50.850659663 +0000 UTC m=+157.118933454" lastFinishedPulling="2025-11-25 15:39:10.328857143 +0000 UTC m=+236.597130924" observedRunningTime="2025-11-25 15:39:11.478632273 +0000 UTC m=+237.746906054" watchObservedRunningTime="2025-11-25 15:39:11.481774669 +0000 UTC m=+237.750048450" Nov 25 15:39:15 crc kubenswrapper[4704]: I1125 15:39:15.947991 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:39:15 crc kubenswrapper[4704]: I1125 15:39:15.948851 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:39:16 crc kubenswrapper[4704]: I1125 15:39:16.212607 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:39:16 crc kubenswrapper[4704]: I1125 15:39:16.213071 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:39:16 crc kubenswrapper[4704]: I1125 15:39:16.367810 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:39:16 crc kubenswrapper[4704]: I1125 15:39:16.367967 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:39:16 crc kubenswrapper[4704]: I1125 15:39:16.455708 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:39:16 crc kubenswrapper[4704]: I1125 15:39:16.455862 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:39:16 crc kubenswrapper[4704]: I1125 15:39:16.494747 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:39:16 crc kubenswrapper[4704]: I1125 15:39:16.523162 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:39:16 crc kubenswrapper[4704]: I1125 15:39:16.527941 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:39:16 crc kubenswrapper[4704]: I1125 15:39:16.577529 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:39:16 crc 
kubenswrapper[4704]: I1125 15:39:16.577592 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:39:16 crc kubenswrapper[4704]: I1125 15:39:16.621878 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:39:17 crc kubenswrapper[4704]: I1125 15:39:17.541650 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:39:17 crc kubenswrapper[4704]: I1125 15:39:17.544071 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:39:17 crc kubenswrapper[4704]: I1125 15:39:17.933958 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:39:17 crc kubenswrapper[4704]: I1125 15:39:17.934272 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:39:17 crc kubenswrapper[4704]: I1125 15:39:17.978560 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:39:18 crc kubenswrapper[4704]: I1125 15:39:18.299276 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:39:18 crc kubenswrapper[4704]: I1125 15:39:18.299353 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:39:18 crc kubenswrapper[4704]: I1125 15:39:18.348068 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:39:18 crc kubenswrapper[4704]: I1125 15:39:18.538855 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:39:18 crc kubenswrapper[4704]: I1125 15:39:18.541115 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:39:18 crc kubenswrapper[4704]: I1125 15:39:18.606053 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x9rvl"] Nov 25 15:39:18 crc kubenswrapper[4704]: I1125 15:39:18.806853 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vnchd"] Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.118659 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.118807 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.175175 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.487517 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.488133 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.501457 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x9rvl" podUID="72d532a8-8d23-4796-a9eb-80a3aeb3acae" containerName="registry-server" containerID="cri-o://f4552af765c1af0f3f5edc7215788e8556670a1b7730b4f502ce4f48dbd1a7d2" gracePeriod=2 Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.501507 4704 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vnchd" podUID="3e025f26-2a87-46e7-a152-84793272fb4b" containerName="registry-server" containerID="cri-o://f8988a351b10566bc8fcaeba5c427e12f28b28e5a4802d89eafdc57fb6f86f57" gracePeriod=2 Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.541470 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.541631 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.596973 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.863194 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.923889 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.959943 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e025f26-2a87-46e7-a152-84793272fb4b-utilities\") pod \"3e025f26-2a87-46e7-a152-84793272fb4b\" (UID: \"3e025f26-2a87-46e7-a152-84793272fb4b\") " Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.960132 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrqdb\" (UniqueName: \"kubernetes.io/projected/3e025f26-2a87-46e7-a152-84793272fb4b-kube-api-access-vrqdb\") pod \"3e025f26-2a87-46e7-a152-84793272fb4b\" (UID: \"3e025f26-2a87-46e7-a152-84793272fb4b\") " Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.960911 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e025f26-2a87-46e7-a152-84793272fb4b-utilities" (OuterVolumeSpecName: "utilities") pod "3e025f26-2a87-46e7-a152-84793272fb4b" (UID: "3e025f26-2a87-46e7-a152-84793272fb4b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.961600 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e025f26-2a87-46e7-a152-84793272fb4b-catalog-content\") pod \"3e025f26-2a87-46e7-a152-84793272fb4b\" (UID: \"3e025f26-2a87-46e7-a152-84793272fb4b\") " Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.962053 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e025f26-2a87-46e7-a152-84793272fb4b-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:19 crc kubenswrapper[4704]: I1125 15:39:19.966725 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e025f26-2a87-46e7-a152-84793272fb4b-kube-api-access-vrqdb" (OuterVolumeSpecName: "kube-api-access-vrqdb") pod "3e025f26-2a87-46e7-a152-84793272fb4b" (UID: "3e025f26-2a87-46e7-a152-84793272fb4b"). InnerVolumeSpecName "kube-api-access-vrqdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.010214 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e025f26-2a87-46e7-a152-84793272fb4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e025f26-2a87-46e7-a152-84793272fb4b" (UID: "3e025f26-2a87-46e7-a152-84793272fb4b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.063220 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72d532a8-8d23-4796-a9eb-80a3aeb3acae-utilities\") pod \"72d532a8-8d23-4796-a9eb-80a3aeb3acae\" (UID: \"72d532a8-8d23-4796-a9eb-80a3aeb3acae\") " Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.063324 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72d532a8-8d23-4796-a9eb-80a3aeb3acae-catalog-content\") pod \"72d532a8-8d23-4796-a9eb-80a3aeb3acae\" (UID: \"72d532a8-8d23-4796-a9eb-80a3aeb3acae\") " Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.063395 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5gd4\" (UniqueName: \"kubernetes.io/projected/72d532a8-8d23-4796-a9eb-80a3aeb3acae-kube-api-access-l5gd4\") pod \"72d532a8-8d23-4796-a9eb-80a3aeb3acae\" (UID: \"72d532a8-8d23-4796-a9eb-80a3aeb3acae\") " Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.063697 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrqdb\" (UniqueName: \"kubernetes.io/projected/3e025f26-2a87-46e7-a152-84793272fb4b-kube-api-access-vrqdb\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.063723 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e025f26-2a87-46e7-a152-84793272fb4b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.064199 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72d532a8-8d23-4796-a9eb-80a3aeb3acae-utilities" (OuterVolumeSpecName: "utilities") pod "72d532a8-8d23-4796-a9eb-80a3aeb3acae" (UID: 
"72d532a8-8d23-4796-a9eb-80a3aeb3acae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.071233 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72d532a8-8d23-4796-a9eb-80a3aeb3acae-kube-api-access-l5gd4" (OuterVolumeSpecName: "kube-api-access-l5gd4") pod "72d532a8-8d23-4796-a9eb-80a3aeb3acae" (UID: "72d532a8-8d23-4796-a9eb-80a3aeb3acae"). InnerVolumeSpecName "kube-api-access-l5gd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.117029 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72d532a8-8d23-4796-a9eb-80a3aeb3acae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72d532a8-8d23-4796-a9eb-80a3aeb3acae" (UID: "72d532a8-8d23-4796-a9eb-80a3aeb3acae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.165770 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5gd4\" (UniqueName: \"kubernetes.io/projected/72d532a8-8d23-4796-a9eb-80a3aeb3acae-kube-api-access-l5gd4\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.165840 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72d532a8-8d23-4796-a9eb-80a3aeb3acae-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.165858 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72d532a8-8d23-4796-a9eb-80a3aeb3acae-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.506862 4704 generic.go:334] "Generic (PLEG): container finished" 
podID="72d532a8-8d23-4796-a9eb-80a3aeb3acae" containerID="f4552af765c1af0f3f5edc7215788e8556670a1b7730b4f502ce4f48dbd1a7d2" exitCode=0 Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.506931 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9rvl" event={"ID":"72d532a8-8d23-4796-a9eb-80a3aeb3acae","Type":"ContainerDied","Data":"f4552af765c1af0f3f5edc7215788e8556670a1b7730b4f502ce4f48dbd1a7d2"} Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.506965 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9rvl" event={"ID":"72d532a8-8d23-4796-a9eb-80a3aeb3acae","Type":"ContainerDied","Data":"97100f37f45484a09a4acee208358ae0dde49f4b1b520193a3d0f01e1160d860"} Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.506987 4704 scope.go:117] "RemoveContainer" containerID="f4552af765c1af0f3f5edc7215788e8556670a1b7730b4f502ce4f48dbd1a7d2" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.507121 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x9rvl" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.516302 4704 generic.go:334] "Generic (PLEG): container finished" podID="3e025f26-2a87-46e7-a152-84793272fb4b" containerID="f8988a351b10566bc8fcaeba5c427e12f28b28e5a4802d89eafdc57fb6f86f57" exitCode=0 Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.516911 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnchd" event={"ID":"3e025f26-2a87-46e7-a152-84793272fb4b","Type":"ContainerDied","Data":"f8988a351b10566bc8fcaeba5c427e12f28b28e5a4802d89eafdc57fb6f86f57"} Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.516962 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnchd" event={"ID":"3e025f26-2a87-46e7-a152-84793272fb4b","Type":"ContainerDied","Data":"ac0aaaf195cc5cbbd51e5bb429c20e680f844dd33939a2a66958b62af85faf15"} Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.517078 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vnchd" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.529057 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x9rvl"] Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.534398 4704 scope.go:117] "RemoveContainer" containerID="69a0ecd507d2ab069ac526d6a97b606482aeea78a5d1c7853489fa39042c5f8c" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.538573 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x9rvl"] Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.544314 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vnchd"] Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.548410 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vnchd"] Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.558312 4704 scope.go:117] "RemoveContainer" containerID="68704e5ca5830cab0bb46c6e10d7f6dcc452beaa03cac001b12d02a8d99dd051" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.573079 4704 scope.go:117] "RemoveContainer" containerID="f4552af765c1af0f3f5edc7215788e8556670a1b7730b4f502ce4f48dbd1a7d2" Nov 25 15:39:20 crc kubenswrapper[4704]: E1125 15:39:20.573630 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4552af765c1af0f3f5edc7215788e8556670a1b7730b4f502ce4f48dbd1a7d2\": container with ID starting with f4552af765c1af0f3f5edc7215788e8556670a1b7730b4f502ce4f48dbd1a7d2 not found: ID does not exist" containerID="f4552af765c1af0f3f5edc7215788e8556670a1b7730b4f502ce4f48dbd1a7d2" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.573667 4704 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f4552af765c1af0f3f5edc7215788e8556670a1b7730b4f502ce4f48dbd1a7d2"} err="failed to get container status \"f4552af765c1af0f3f5edc7215788e8556670a1b7730b4f502ce4f48dbd1a7d2\": rpc error: code = NotFound desc = could not find container \"f4552af765c1af0f3f5edc7215788e8556670a1b7730b4f502ce4f48dbd1a7d2\": container with ID starting with f4552af765c1af0f3f5edc7215788e8556670a1b7730b4f502ce4f48dbd1a7d2 not found: ID does not exist" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.573694 4704 scope.go:117] "RemoveContainer" containerID="69a0ecd507d2ab069ac526d6a97b606482aeea78a5d1c7853489fa39042c5f8c" Nov 25 15:39:20 crc kubenswrapper[4704]: E1125 15:39:20.574227 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69a0ecd507d2ab069ac526d6a97b606482aeea78a5d1c7853489fa39042c5f8c\": container with ID starting with 69a0ecd507d2ab069ac526d6a97b606482aeea78a5d1c7853489fa39042c5f8c not found: ID does not exist" containerID="69a0ecd507d2ab069ac526d6a97b606482aeea78a5d1c7853489fa39042c5f8c" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.574269 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69a0ecd507d2ab069ac526d6a97b606482aeea78a5d1c7853489fa39042c5f8c"} err="failed to get container status \"69a0ecd507d2ab069ac526d6a97b606482aeea78a5d1c7853489fa39042c5f8c\": rpc error: code = NotFound desc = could not find container \"69a0ecd507d2ab069ac526d6a97b606482aeea78a5d1c7853489fa39042c5f8c\": container with ID starting with 69a0ecd507d2ab069ac526d6a97b606482aeea78a5d1c7853489fa39042c5f8c not found: ID does not exist" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.574288 4704 scope.go:117] "RemoveContainer" containerID="68704e5ca5830cab0bb46c6e10d7f6dcc452beaa03cac001b12d02a8d99dd051" Nov 25 15:39:20 crc kubenswrapper[4704]: E1125 15:39:20.574556 4704 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"68704e5ca5830cab0bb46c6e10d7f6dcc452beaa03cac001b12d02a8d99dd051\": container with ID starting with 68704e5ca5830cab0bb46c6e10d7f6dcc452beaa03cac001b12d02a8d99dd051 not found: ID does not exist" containerID="68704e5ca5830cab0bb46c6e10d7f6dcc452beaa03cac001b12d02a8d99dd051" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.574583 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68704e5ca5830cab0bb46c6e10d7f6dcc452beaa03cac001b12d02a8d99dd051"} err="failed to get container status \"68704e5ca5830cab0bb46c6e10d7f6dcc452beaa03cac001b12d02a8d99dd051\": rpc error: code = NotFound desc = could not find container \"68704e5ca5830cab0bb46c6e10d7f6dcc452beaa03cac001b12d02a8d99dd051\": container with ID starting with 68704e5ca5830cab0bb46c6e10d7f6dcc452beaa03cac001b12d02a8d99dd051 not found: ID does not exist" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.574598 4704 scope.go:117] "RemoveContainer" containerID="f8988a351b10566bc8fcaeba5c427e12f28b28e5a4802d89eafdc57fb6f86f57" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.589776 4704 scope.go:117] "RemoveContainer" containerID="df31f8b0bb2d12431fd798a0c506daf51186942f6fa160b68a7eaed1bc0921ae" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.605408 4704 scope.go:117] "RemoveContainer" containerID="09f6279cf74281b27542a52cee0aae733699067167e89815c8b289249ce1d2fc" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.624137 4704 scope.go:117] "RemoveContainer" containerID="f8988a351b10566bc8fcaeba5c427e12f28b28e5a4802d89eafdc57fb6f86f57" Nov 25 15:39:20 crc kubenswrapper[4704]: E1125 15:39:20.624679 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8988a351b10566bc8fcaeba5c427e12f28b28e5a4802d89eafdc57fb6f86f57\": container with ID starting with 
f8988a351b10566bc8fcaeba5c427e12f28b28e5a4802d89eafdc57fb6f86f57 not found: ID does not exist" containerID="f8988a351b10566bc8fcaeba5c427e12f28b28e5a4802d89eafdc57fb6f86f57" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.624724 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8988a351b10566bc8fcaeba5c427e12f28b28e5a4802d89eafdc57fb6f86f57"} err="failed to get container status \"f8988a351b10566bc8fcaeba5c427e12f28b28e5a4802d89eafdc57fb6f86f57\": rpc error: code = NotFound desc = could not find container \"f8988a351b10566bc8fcaeba5c427e12f28b28e5a4802d89eafdc57fb6f86f57\": container with ID starting with f8988a351b10566bc8fcaeba5c427e12f28b28e5a4802d89eafdc57fb6f86f57 not found: ID does not exist" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.624759 4704 scope.go:117] "RemoveContainer" containerID="df31f8b0bb2d12431fd798a0c506daf51186942f6fa160b68a7eaed1bc0921ae" Nov 25 15:39:20 crc kubenswrapper[4704]: E1125 15:39:20.625227 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df31f8b0bb2d12431fd798a0c506daf51186942f6fa160b68a7eaed1bc0921ae\": container with ID starting with df31f8b0bb2d12431fd798a0c506daf51186942f6fa160b68a7eaed1bc0921ae not found: ID does not exist" containerID="df31f8b0bb2d12431fd798a0c506daf51186942f6fa160b68a7eaed1bc0921ae" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.625264 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df31f8b0bb2d12431fd798a0c506daf51186942f6fa160b68a7eaed1bc0921ae"} err="failed to get container status \"df31f8b0bb2d12431fd798a0c506daf51186942f6fa160b68a7eaed1bc0921ae\": rpc error: code = NotFound desc = could not find container \"df31f8b0bb2d12431fd798a0c506daf51186942f6fa160b68a7eaed1bc0921ae\": container with ID starting with df31f8b0bb2d12431fd798a0c506daf51186942f6fa160b68a7eaed1bc0921ae not found: ID does not 
exist" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.625295 4704 scope.go:117] "RemoveContainer" containerID="09f6279cf74281b27542a52cee0aae733699067167e89815c8b289249ce1d2fc" Nov 25 15:39:20 crc kubenswrapper[4704]: E1125 15:39:20.625580 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09f6279cf74281b27542a52cee0aae733699067167e89815c8b289249ce1d2fc\": container with ID starting with 09f6279cf74281b27542a52cee0aae733699067167e89815c8b289249ce1d2fc not found: ID does not exist" containerID="09f6279cf74281b27542a52cee0aae733699067167e89815c8b289249ce1d2fc" Nov 25 15:39:20 crc kubenswrapper[4704]: I1125 15:39:20.625603 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09f6279cf74281b27542a52cee0aae733699067167e89815c8b289249ce1d2fc"} err="failed to get container status \"09f6279cf74281b27542a52cee0aae733699067167e89815c8b289249ce1d2fc\": rpc error: code = NotFound desc = could not find container \"09f6279cf74281b27542a52cee0aae733699067167e89815c8b289249ce1d2fc\": container with ID starting with 09f6279cf74281b27542a52cee0aae733699067167e89815c8b289249ce1d2fc not found: ID does not exist" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.007149 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzmll"] Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.008391 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gzmll" podUID="5448c847-73a1-4fdc-8d52-d0b0f4ea5129" containerName="registry-server" containerID="cri-o://99a5c5d6302287fc9aa215cdbf1c2ca86fb5a656d2493e049f1c4cadfd816ad1" gracePeriod=2 Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.384431 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.489508 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-utilities\") pod \"5448c847-73a1-4fdc-8d52-d0b0f4ea5129\" (UID: \"5448c847-73a1-4fdc-8d52-d0b0f4ea5129\") " Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.489589 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nks6c\" (UniqueName: \"kubernetes.io/projected/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-kube-api-access-nks6c\") pod \"5448c847-73a1-4fdc-8d52-d0b0f4ea5129\" (UID: \"5448c847-73a1-4fdc-8d52-d0b0f4ea5129\") " Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.489611 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-catalog-content\") pod \"5448c847-73a1-4fdc-8d52-d0b0f4ea5129\" (UID: \"5448c847-73a1-4fdc-8d52-d0b0f4ea5129\") " Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.490802 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-utilities" (OuterVolumeSpecName: "utilities") pod "5448c847-73a1-4fdc-8d52-d0b0f4ea5129" (UID: "5448c847-73a1-4fdc-8d52-d0b0f4ea5129"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.496331 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-kube-api-access-nks6c" (OuterVolumeSpecName: "kube-api-access-nks6c") pod "5448c847-73a1-4fdc-8d52-d0b0f4ea5129" (UID: "5448c847-73a1-4fdc-8d52-d0b0f4ea5129"). InnerVolumeSpecName "kube-api-access-nks6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.505216 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5448c847-73a1-4fdc-8d52-d0b0f4ea5129" (UID: "5448c847-73a1-4fdc-8d52-d0b0f4ea5129"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.525219 4704 generic.go:334] "Generic (PLEG): container finished" podID="5448c847-73a1-4fdc-8d52-d0b0f4ea5129" containerID="99a5c5d6302287fc9aa215cdbf1c2ca86fb5a656d2493e049f1c4cadfd816ad1" exitCode=0 Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.525287 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzmll" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.525318 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzmll" event={"ID":"5448c847-73a1-4fdc-8d52-d0b0f4ea5129","Type":"ContainerDied","Data":"99a5c5d6302287fc9aa215cdbf1c2ca86fb5a656d2493e049f1c4cadfd816ad1"} Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.525389 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzmll" event={"ID":"5448c847-73a1-4fdc-8d52-d0b0f4ea5129","Type":"ContainerDied","Data":"d58334644cb8089ee185198f8eaedeaee32b42db214f48db698edbd6ac3783bd"} Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.525416 4704 scope.go:117] "RemoveContainer" containerID="99a5c5d6302287fc9aa215cdbf1c2ca86fb5a656d2493e049f1c4cadfd816ad1" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.546245 4704 scope.go:117] "RemoveContainer" containerID="5c4d31000afa52a372375f02fcede828c3bdf9308eccb312ed8599f35d42f098" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 
15:39:21.558078 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzmll"] Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.560509 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzmll"] Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.581407 4704 scope.go:117] "RemoveContainer" containerID="d8986572938f73df99c95558ffc0e1b9eb720c7ea80f731b2825b691d5c53b50" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.591726 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nks6c\" (UniqueName: \"kubernetes.io/projected/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-kube-api-access-nks6c\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.591762 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.591777 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5448c847-73a1-4fdc-8d52-d0b0f4ea5129-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.595896 4704 scope.go:117] "RemoveContainer" containerID="99a5c5d6302287fc9aa215cdbf1c2ca86fb5a656d2493e049f1c4cadfd816ad1" Nov 25 15:39:21 crc kubenswrapper[4704]: E1125 15:39:21.596765 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99a5c5d6302287fc9aa215cdbf1c2ca86fb5a656d2493e049f1c4cadfd816ad1\": container with ID starting with 99a5c5d6302287fc9aa215cdbf1c2ca86fb5a656d2493e049f1c4cadfd816ad1 not found: ID does not exist" containerID="99a5c5d6302287fc9aa215cdbf1c2ca86fb5a656d2493e049f1c4cadfd816ad1" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.596823 4704 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99a5c5d6302287fc9aa215cdbf1c2ca86fb5a656d2493e049f1c4cadfd816ad1"} err="failed to get container status \"99a5c5d6302287fc9aa215cdbf1c2ca86fb5a656d2493e049f1c4cadfd816ad1\": rpc error: code = NotFound desc = could not find container \"99a5c5d6302287fc9aa215cdbf1c2ca86fb5a656d2493e049f1c4cadfd816ad1\": container with ID starting with 99a5c5d6302287fc9aa215cdbf1c2ca86fb5a656d2493e049f1c4cadfd816ad1 not found: ID does not exist" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.596851 4704 scope.go:117] "RemoveContainer" containerID="5c4d31000afa52a372375f02fcede828c3bdf9308eccb312ed8599f35d42f098" Nov 25 15:39:21 crc kubenswrapper[4704]: E1125 15:39:21.597293 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c4d31000afa52a372375f02fcede828c3bdf9308eccb312ed8599f35d42f098\": container with ID starting with 5c4d31000afa52a372375f02fcede828c3bdf9308eccb312ed8599f35d42f098 not found: ID does not exist" containerID="5c4d31000afa52a372375f02fcede828c3bdf9308eccb312ed8599f35d42f098" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.597482 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c4d31000afa52a372375f02fcede828c3bdf9308eccb312ed8599f35d42f098"} err="failed to get container status \"5c4d31000afa52a372375f02fcede828c3bdf9308eccb312ed8599f35d42f098\": rpc error: code = NotFound desc = could not find container \"5c4d31000afa52a372375f02fcede828c3bdf9308eccb312ed8599f35d42f098\": container with ID starting with 5c4d31000afa52a372375f02fcede828c3bdf9308eccb312ed8599f35d42f098 not found: ID does not exist" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.597519 4704 scope.go:117] "RemoveContainer" containerID="d8986572938f73df99c95558ffc0e1b9eb720c7ea80f731b2825b691d5c53b50" Nov 25 15:39:21 crc kubenswrapper[4704]: E1125 
15:39:21.598757 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8986572938f73df99c95558ffc0e1b9eb720c7ea80f731b2825b691d5c53b50\": container with ID starting with d8986572938f73df99c95558ffc0e1b9eb720c7ea80f731b2825b691d5c53b50 not found: ID does not exist" containerID="d8986572938f73df99c95558ffc0e1b9eb720c7ea80f731b2825b691d5c53b50" Nov 25 15:39:21 crc kubenswrapper[4704]: I1125 15:39:21.598820 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8986572938f73df99c95558ffc0e1b9eb720c7ea80f731b2825b691d5c53b50"} err="failed to get container status \"d8986572938f73df99c95558ffc0e1b9eb720c7ea80f731b2825b691d5c53b50\": rpc error: code = NotFound desc = could not find container \"d8986572938f73df99c95558ffc0e1b9eb720c7ea80f731b2825b691d5c53b50\": container with ID starting with d8986572938f73df99c95558ffc0e1b9eb720c7ea80f731b2825b691d5c53b50 not found: ID does not exist" Nov 25 15:39:22 crc kubenswrapper[4704]: I1125 15:39:22.436550 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e025f26-2a87-46e7-a152-84793272fb4b" path="/var/lib/kubelet/pods/3e025f26-2a87-46e7-a152-84793272fb4b/volumes" Nov 25 15:39:22 crc kubenswrapper[4704]: I1125 15:39:22.437266 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5448c847-73a1-4fdc-8d52-d0b0f4ea5129" path="/var/lib/kubelet/pods/5448c847-73a1-4fdc-8d52-d0b0f4ea5129/volumes" Nov 25 15:39:22 crc kubenswrapper[4704]: I1125 15:39:22.438057 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72d532a8-8d23-4796-a9eb-80a3aeb3acae" path="/var/lib/kubelet/pods/72d532a8-8d23-4796-a9eb-80a3aeb3acae/volumes" Nov 25 15:39:23 crc kubenswrapper[4704]: I1125 15:39:23.404505 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d57hr"] Nov 25 15:39:23 crc kubenswrapper[4704]: I1125 15:39:23.405269 4704 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d57hr" podUID="babe0f08-98f5-4fde-827a-148857d14ebe" containerName="registry-server" containerID="cri-o://ba09015ba8c7d8a74a422af49cd1b9b24702456343aa7130512a223e529f593a" gracePeriod=2 Nov 25 15:39:24 crc kubenswrapper[4704]: I1125 15:39:24.546746 4704 generic.go:334] "Generic (PLEG): container finished" podID="babe0f08-98f5-4fde-827a-148857d14ebe" containerID="ba09015ba8c7d8a74a422af49cd1b9b24702456343aa7130512a223e529f593a" exitCode=0 Nov 25 15:39:24 crc kubenswrapper[4704]: I1125 15:39:24.546813 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d57hr" event={"ID":"babe0f08-98f5-4fde-827a-148857d14ebe","Type":"ContainerDied","Data":"ba09015ba8c7d8a74a422af49cd1b9b24702456343aa7130512a223e529f593a"} Nov 25 15:39:25 crc kubenswrapper[4704]: I1125 15:39:25.555619 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d57hr" event={"ID":"babe0f08-98f5-4fde-827a-148857d14ebe","Type":"ContainerDied","Data":"25fa03f23a85fc99ba71f7def1d9c90c39980c91b554a9b1a25340dbe125f1d5"} Nov 25 15:39:25 crc kubenswrapper[4704]: I1125 15:39:25.556071 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25fa03f23a85fc99ba71f7def1d9c90c39980c91b554a9b1a25340dbe125f1d5" Nov 25 15:39:25 crc kubenswrapper[4704]: I1125 15:39:25.574333 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:39:25 crc kubenswrapper[4704]: I1125 15:39:25.745618 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/babe0f08-98f5-4fde-827a-148857d14ebe-utilities\") pod \"babe0f08-98f5-4fde-827a-148857d14ebe\" (UID: \"babe0f08-98f5-4fde-827a-148857d14ebe\") " Nov 25 15:39:25 crc kubenswrapper[4704]: I1125 15:39:25.746980 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vntgh\" (UniqueName: \"kubernetes.io/projected/babe0f08-98f5-4fde-827a-148857d14ebe-kube-api-access-vntgh\") pod \"babe0f08-98f5-4fde-827a-148857d14ebe\" (UID: \"babe0f08-98f5-4fde-827a-148857d14ebe\") " Nov 25 15:39:25 crc kubenswrapper[4704]: I1125 15:39:25.747023 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/babe0f08-98f5-4fde-827a-148857d14ebe-catalog-content\") pod \"babe0f08-98f5-4fde-827a-148857d14ebe\" (UID: \"babe0f08-98f5-4fde-827a-148857d14ebe\") " Nov 25 15:39:25 crc kubenswrapper[4704]: I1125 15:39:25.746897 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/babe0f08-98f5-4fde-827a-148857d14ebe-utilities" (OuterVolumeSpecName: "utilities") pod "babe0f08-98f5-4fde-827a-148857d14ebe" (UID: "babe0f08-98f5-4fde-827a-148857d14ebe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:39:25 crc kubenswrapper[4704]: I1125 15:39:25.755455 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/babe0f08-98f5-4fde-827a-148857d14ebe-kube-api-access-vntgh" (OuterVolumeSpecName: "kube-api-access-vntgh") pod "babe0f08-98f5-4fde-827a-148857d14ebe" (UID: "babe0f08-98f5-4fde-827a-148857d14ebe"). InnerVolumeSpecName "kube-api-access-vntgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:39:25 crc kubenswrapper[4704]: I1125 15:39:25.848396 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/babe0f08-98f5-4fde-827a-148857d14ebe-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:25 crc kubenswrapper[4704]: I1125 15:39:25.848444 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vntgh\" (UniqueName: \"kubernetes.io/projected/babe0f08-98f5-4fde-827a-148857d14ebe-kube-api-access-vntgh\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:26 crc kubenswrapper[4704]: I1125 15:39:26.560704 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d57hr" Nov 25 15:39:26 crc kubenswrapper[4704]: I1125 15:39:26.699282 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/babe0f08-98f5-4fde-827a-148857d14ebe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "babe0f08-98f5-4fde-827a-148857d14ebe" (UID: "babe0f08-98f5-4fde-827a-148857d14ebe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:39:26 crc kubenswrapper[4704]: I1125 15:39:26.758981 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/babe0f08-98f5-4fde-827a-148857d14ebe-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:39:26 crc kubenswrapper[4704]: I1125 15:39:26.894645 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d57hr"] Nov 25 15:39:26 crc kubenswrapper[4704]: I1125 15:39:26.902219 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d57hr"] Nov 25 15:39:28 crc kubenswrapper[4704]: I1125 15:39:28.424377 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="babe0f08-98f5-4fde-827a-148857d14ebe" path="/var/lib/kubelet/pods/babe0f08-98f5-4fde-827a-148857d14ebe/volumes" Nov 25 15:39:34 crc kubenswrapper[4704]: I1125 15:39:34.759686 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5llzt"] Nov 25 15:39:59 crc kubenswrapper[4704]: I1125 15:39:59.788534 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" podUID="17320021-32dc-4bef-befa-fa0a7c2b8533" containerName="oauth-openshift" containerID="cri-o://97cc3b1151fc8f3b04bfe716a3f0164ec2ee6ec3f7324322e2d4db66d4f160fe" gracePeriod=15 Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.202983 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.251877 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-75566f9bd7-zgdvw"] Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.252303 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473f9f98-2337-4ceb-a91f-6fe3cd2dffc2" containerName="pruner" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252318 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="473f9f98-2337-4ceb-a91f-6fe3cd2dffc2" containerName="pruner" Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.252330 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="babe0f08-98f5-4fde-827a-148857d14ebe" containerName="extract-utilities" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252337 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="babe0f08-98f5-4fde-827a-148857d14ebe" containerName="extract-utilities" Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.252345 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72d532a8-8d23-4796-a9eb-80a3aeb3acae" containerName="extract-utilities" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252352 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="72d532a8-8d23-4796-a9eb-80a3aeb3acae" containerName="extract-utilities" Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.252359 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17320021-32dc-4bef-befa-fa0a7c2b8533" containerName="oauth-openshift" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252365 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="17320021-32dc-4bef-befa-fa0a7c2b8533" containerName="oauth-openshift" Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.252375 4704 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3e025f26-2a87-46e7-a152-84793272fb4b" containerName="extract-content" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252381 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e025f26-2a87-46e7-a152-84793272fb4b" containerName="extract-content" Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.252389 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b9ead9-033f-44cb-9657-6a078bed2c0d" containerName="collect-profiles" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252395 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b9ead9-033f-44cb-9657-6a078bed2c0d" containerName="collect-profiles" Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.252404 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5448c847-73a1-4fdc-8d52-d0b0f4ea5129" containerName="extract-utilities" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252412 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="5448c847-73a1-4fdc-8d52-d0b0f4ea5129" containerName="extract-utilities" Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.252419 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5448c847-73a1-4fdc-8d52-d0b0f4ea5129" containerName="extract-content" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252425 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="5448c847-73a1-4fdc-8d52-d0b0f4ea5129" containerName="extract-content" Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.252431 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="babe0f08-98f5-4fde-827a-148857d14ebe" containerName="extract-content" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252437 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="babe0f08-98f5-4fde-827a-148857d14ebe" containerName="extract-content" Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.252446 4704 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="72d532a8-8d23-4796-a9eb-80a3aeb3acae" containerName="extract-content" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252451 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="72d532a8-8d23-4796-a9eb-80a3aeb3acae" containerName="extract-content" Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.252460 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="babe0f08-98f5-4fde-827a-148857d14ebe" containerName="registry-server" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252466 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="babe0f08-98f5-4fde-827a-148857d14ebe" containerName="registry-server" Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.252474 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72d532a8-8d23-4796-a9eb-80a3aeb3acae" containerName="registry-server" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252480 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="72d532a8-8d23-4796-a9eb-80a3aeb3acae" containerName="registry-server" Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.252489 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e206ba36-f94d-46da-af87-c89ea875f4c5" containerName="pruner" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252495 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="e206ba36-f94d-46da-af87-c89ea875f4c5" containerName="pruner" Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.252505 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e025f26-2a87-46e7-a152-84793272fb4b" containerName="registry-server" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252510 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e025f26-2a87-46e7-a152-84793272fb4b" containerName="registry-server" Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.252519 4704 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3e025f26-2a87-46e7-a152-84793272fb4b" containerName="extract-utilities" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252525 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e025f26-2a87-46e7-a152-84793272fb4b" containerName="extract-utilities" Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.252551 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5448c847-73a1-4fdc-8d52-d0b0f4ea5129" containerName="registry-server" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252559 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="5448c847-73a1-4fdc-8d52-d0b0f4ea5129" containerName="registry-server" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252660 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="babe0f08-98f5-4fde-827a-148857d14ebe" containerName="registry-server" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252672 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="17320021-32dc-4bef-befa-fa0a7c2b8533" containerName="oauth-openshift" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252679 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e025f26-2a87-46e7-a152-84793272fb4b" containerName="registry-server" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252687 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="473f9f98-2337-4ceb-a91f-6fe3cd2dffc2" containerName="pruner" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252696 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="5448c847-73a1-4fdc-8d52-d0b0f4ea5129" containerName="registry-server" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252704 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="72d532a8-8d23-4796-a9eb-80a3aeb3acae" containerName="registry-server" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252711 4704 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="38b9ead9-033f-44cb-9657-6a078bed2c0d" containerName="collect-profiles" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.252716 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="e206ba36-f94d-46da-af87-c89ea875f4c5" containerName="pruner" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.253179 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.254680 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75566f9bd7-zgdvw"] Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.317828 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/17320021-32dc-4bef-befa-fa0a7c2b8533-audit-dir\") pod \"17320021-32dc-4bef-befa-fa0a7c2b8533\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.317917 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-service-ca\") pod \"17320021-32dc-4bef-befa-fa0a7c2b8533\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.317945 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-error\") pod \"17320021-32dc-4bef-befa-fa0a7c2b8533\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.317963 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-cliconfig\") pod \"17320021-32dc-4bef-befa-fa0a7c2b8533\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.318005 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-session\") pod \"17320021-32dc-4bef-befa-fa0a7c2b8533\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.318093 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-serving-cert\") pod \"17320021-32dc-4bef-befa-fa0a7c2b8533\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.318113 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-audit-policies\") pod \"17320021-32dc-4bef-befa-fa0a7c2b8533\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.318138 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-provider-selection\") pod \"17320021-32dc-4bef-befa-fa0a7c2b8533\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.318157 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-trusted-ca-bundle\") pod \"17320021-32dc-4bef-befa-fa0a7c2b8533\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.318178 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-login\") pod \"17320021-32dc-4bef-befa-fa0a7c2b8533\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.318207 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-idp-0-file-data\") pod \"17320021-32dc-4bef-befa-fa0a7c2b8533\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.318226 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-router-certs\") pod \"17320021-32dc-4bef-befa-fa0a7c2b8533\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.318252 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-ocp-branding-template\") pod \"17320021-32dc-4bef-befa-fa0a7c2b8533\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.318280 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n8z2\" (UniqueName: 
\"kubernetes.io/projected/17320021-32dc-4bef-befa-fa0a7c2b8533-kube-api-access-2n8z2\") pod \"17320021-32dc-4bef-befa-fa0a7c2b8533\" (UID: \"17320021-32dc-4bef-befa-fa0a7c2b8533\") " Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.318914 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17320021-32dc-4bef-befa-fa0a7c2b8533-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "17320021-32dc-4bef-befa-fa0a7c2b8533" (UID: "17320021-32dc-4bef-befa-fa0a7c2b8533"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.319186 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "17320021-32dc-4bef-befa-fa0a7c2b8533" (UID: "17320021-32dc-4bef-befa-fa0a7c2b8533"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.319765 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "17320021-32dc-4bef-befa-fa0a7c2b8533" (UID: "17320021-32dc-4bef-befa-fa0a7c2b8533"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.320097 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "17320021-32dc-4bef-befa-fa0a7c2b8533" (UID: "17320021-32dc-4bef-befa-fa0a7c2b8533"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.320320 4704 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/17320021-32dc-4bef-befa-fa0a7c2b8533-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.320814 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.320946 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.320895 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "17320021-32dc-4bef-befa-fa0a7c2b8533" (UID: "17320021-32dc-4bef-befa-fa0a7c2b8533"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.326341 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "17320021-32dc-4bef-befa-fa0a7c2b8533" (UID: "17320021-32dc-4bef-befa-fa0a7c2b8533"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.326729 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "17320021-32dc-4bef-befa-fa0a7c2b8533" (UID: "17320021-32dc-4bef-befa-fa0a7c2b8533"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.327214 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "17320021-32dc-4bef-befa-fa0a7c2b8533" (UID: "17320021-32dc-4bef-befa-fa0a7c2b8533"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.339137 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "17320021-32dc-4bef-befa-fa0a7c2b8533" (UID: "17320021-32dc-4bef-befa-fa0a7c2b8533"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.339350 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17320021-32dc-4bef-befa-fa0a7c2b8533-kube-api-access-2n8z2" (OuterVolumeSpecName: "kube-api-access-2n8z2") pod "17320021-32dc-4bef-befa-fa0a7c2b8533" (UID: "17320021-32dc-4bef-befa-fa0a7c2b8533"). InnerVolumeSpecName "kube-api-access-2n8z2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.339665 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "17320021-32dc-4bef-befa-fa0a7c2b8533" (UID: "17320021-32dc-4bef-befa-fa0a7c2b8533"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.340093 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "17320021-32dc-4bef-befa-fa0a7c2b8533" (UID: "17320021-32dc-4bef-befa-fa0a7c2b8533"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.340361 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "17320021-32dc-4bef-befa-fa0a7c2b8533" (UID: "17320021-32dc-4bef-befa-fa0a7c2b8533"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.340655 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "17320021-32dc-4bef-befa-fa0a7c2b8533" (UID: "17320021-32dc-4bef-befa-fa0a7c2b8533"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.424320 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-session\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.424405 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.424447 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-service-ca\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.424633 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-audit-policies\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.424714 4704 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-router-certs\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.425010 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.425105 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-user-template-error\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.425171 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.425243 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-user-template-login\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.425290 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.425651 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.426025 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6mch\" (UniqueName: \"kubernetes.io/projected/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-kube-api-access-v6mch\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.426081 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-audit-dir\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " 
pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.426118 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.426368 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.426400 4704 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.426425 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.426447 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.426469 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-idp-0-file-data\") 
on node \"crc\" DevicePath \"\"" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.426492 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.426514 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.426537 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n8z2\" (UniqueName: \"kubernetes.io/projected/17320021-32dc-4bef-befa-fa0a7c2b8533-kube-api-access-2n8z2\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.426558 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.426581 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.426601 4704 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/17320021-32dc-4bef-befa-fa0a7c2b8533-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.527711 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" 
(UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-router-certs\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.527811 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.527844 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-user-template-error\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.527885 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.527919 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-user-template-login\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " 
pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.527948 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.527978 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.528021 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6mch\" (UniqueName: \"kubernetes.io/projected/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-kube-api-access-v6mch\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.528045 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.528067 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-audit-dir\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.528111 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-session\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.528143 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.528166 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-audit-policies\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.528189 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-service-ca\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 
15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.528236 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-audit-dir\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.529029 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.529035 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-service-ca\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.530022 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.530626 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-audit-policies\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: 
\"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.531903 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-router-certs\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.532432 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.533385 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.533373 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-user-template-error\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.533674 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.534106 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-user-template-login\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.535840 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.536855 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-v4-0-config-system-session\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.549514 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6mch\" (UniqueName: \"kubernetes.io/projected/2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3-kube-api-access-v6mch\") pod \"oauth-openshift-75566f9bd7-zgdvw\" (UID: \"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3\") " 
pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.577117 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.744322 4704 generic.go:334] "Generic (PLEG): container finished" podID="17320021-32dc-4bef-befa-fa0a7c2b8533" containerID="97cc3b1151fc8f3b04bfe716a3f0164ec2ee6ec3f7324322e2d4db66d4f160fe" exitCode=0 Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.744425 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.744452 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" event={"ID":"17320021-32dc-4bef-befa-fa0a7c2b8533","Type":"ContainerDied","Data":"97cc3b1151fc8f3b04bfe716a3f0164ec2ee6ec3f7324322e2d4db66d4f160fe"} Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.745663 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5llzt" event={"ID":"17320021-32dc-4bef-befa-fa0a7c2b8533","Type":"ContainerDied","Data":"2c0a2e3c359b50e830c05cbdeb911b9ea3ca746287e3507b7a140088d0db535a"} Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.745724 4704 scope.go:117] "RemoveContainer" containerID="97cc3b1151fc8f3b04bfe716a3f0164ec2ee6ec3f7324322e2d4db66d4f160fe" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.781192 4704 scope.go:117] "RemoveContainer" containerID="97cc3b1151fc8f3b04bfe716a3f0164ec2ee6ec3f7324322e2d4db66d4f160fe" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.783737 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5llzt"] Nov 25 15:40:00 crc kubenswrapper[4704]: E1125 15:40:00.783765 4704 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97cc3b1151fc8f3b04bfe716a3f0164ec2ee6ec3f7324322e2d4db66d4f160fe\": container with ID starting with 97cc3b1151fc8f3b04bfe716a3f0164ec2ee6ec3f7324322e2d4db66d4f160fe not found: ID does not exist" containerID="97cc3b1151fc8f3b04bfe716a3f0164ec2ee6ec3f7324322e2d4db66d4f160fe" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.783901 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97cc3b1151fc8f3b04bfe716a3f0164ec2ee6ec3f7324322e2d4db66d4f160fe"} err="failed to get container status \"97cc3b1151fc8f3b04bfe716a3f0164ec2ee6ec3f7324322e2d4db66d4f160fe\": rpc error: code = NotFound desc = could not find container \"97cc3b1151fc8f3b04bfe716a3f0164ec2ee6ec3f7324322e2d4db66d4f160fe\": container with ID starting with 97cc3b1151fc8f3b04bfe716a3f0164ec2ee6ec3f7324322e2d4db66d4f160fe not found: ID does not exist" Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.788165 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5llzt"] Nov 25 15:40:00 crc kubenswrapper[4704]: I1125 15:40:00.978989 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75566f9bd7-zgdvw"] Nov 25 15:40:01 crc kubenswrapper[4704]: I1125 15:40:01.752404 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" event={"ID":"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3","Type":"ContainerStarted","Data":"71fc5c7db6021ba7f0c184de6d0aa790aee8cd20802b59fedb29bbe31d9736fa"} Nov 25 15:40:01 crc kubenswrapper[4704]: I1125 15:40:01.752941 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" 
event={"ID":"2d7913a9-ce60-4789-bc2b-b4eb9e2d97b3","Type":"ContainerStarted","Data":"9b1fc9a0b0352e0c2837a3d210d47877f3532599b9481b995c9981694ffabeb6"} Nov 25 15:40:01 crc kubenswrapper[4704]: I1125 15:40:01.753839 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:01 crc kubenswrapper[4704]: I1125 15:40:01.759081 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" Nov 25 15:40:01 crc kubenswrapper[4704]: I1125 15:40:01.778269 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-75566f9bd7-zgdvw" podStartSLOduration=27.778245521 podStartE2EDuration="27.778245521s" podCreationTimestamp="2025-11-25 15:39:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:40:01.773009126 +0000 UTC m=+288.041282917" watchObservedRunningTime="2025-11-25 15:40:01.778245521 +0000 UTC m=+288.046519302" Nov 25 15:40:02 crc kubenswrapper[4704]: I1125 15:40:02.423668 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17320021-32dc-4bef-befa-fa0a7c2b8533" path="/var/lib/kubelet/pods/17320021-32dc-4bef-befa-fa0a7c2b8533/volumes" Nov 25 15:40:14 crc kubenswrapper[4704]: I1125 15:40:14.240183 4704 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.541708 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mx4sl"] Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.546123 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mx4sl" podUID="260ec6a9-8914-49dc-8cd8-95c8fa30a29a" 
containerName="registry-server" containerID="cri-o://c0c1b8a4b3ec9058a4f9588721d8dd59ee55306bd1432a95d00bc52e270a5e2b" gracePeriod=30 Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.554231 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-smlnk"] Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.556450 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8hvj9"] Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.556966 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-smlnk" podUID="09422116-4570-4f3b-bde3-aaebdb318c47" containerName="registry-server" containerID="cri-o://a29ffc387d11e823ecaa7ff86ac49234cae3877c97949819c9b7fb1e9fb03c08" gracePeriod=30 Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.560315 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" podUID="6193bcc6-1da4-414c-84df-92b1bead0762" containerName="marketplace-operator" containerID="cri-o://1a6d289ac1aa1275bf04bb68f32c952870c145ca1dff24f829e7be3e291711e4" gracePeriod=30 Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.570134 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mffhk"] Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.570489 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mffhk" podUID="36f52ec1-c7de-4345-bc4a-4d8fc6f182fe" containerName="registry-server" containerID="cri-o://ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864" gracePeriod=30 Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.578075 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-crs88"] Nov 25 15:40:27 crc 
kubenswrapper[4704]: I1125 15:40:27.580305 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-crs88" Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.583993 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-94qft"] Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.584426 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-94qft" podUID="cdb3b0c5-af6c-4d36-bba3-5a8419a72107" containerName="registry-server" containerID="cri-o://778b173d28098e0b96b3b2c5e7567794f5752393c66df47e0144e1c225f5f314" gracePeriod=30 Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.593442 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-crs88"] Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.688447 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8cr9\" (UniqueName: \"kubernetes.io/projected/54e9da8e-917b-4a46-9fe9-725f950fced1-kube-api-access-f8cr9\") pod \"marketplace-operator-79b997595-crs88\" (UID: \"54e9da8e-917b-4a46-9fe9-725f950fced1\") " pod="openshift-marketplace/marketplace-operator-79b997595-crs88" Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.688520 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54e9da8e-917b-4a46-9fe9-725f950fced1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-crs88\" (UID: \"54e9da8e-917b-4a46-9fe9-725f950fced1\") " pod="openshift-marketplace/marketplace-operator-79b997595-crs88" Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.688569 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54e9da8e-917b-4a46-9fe9-725f950fced1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-crs88\" (UID: \"54e9da8e-917b-4a46-9fe9-725f950fced1\") " pod="openshift-marketplace/marketplace-operator-79b997595-crs88" Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.790011 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54e9da8e-917b-4a46-9fe9-725f950fced1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-crs88\" (UID: \"54e9da8e-917b-4a46-9fe9-725f950fced1\") " pod="openshift-marketplace/marketplace-operator-79b997595-crs88" Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.790137 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8cr9\" (UniqueName: \"kubernetes.io/projected/54e9da8e-917b-4a46-9fe9-725f950fced1-kube-api-access-f8cr9\") pod \"marketplace-operator-79b997595-crs88\" (UID: \"54e9da8e-917b-4a46-9fe9-725f950fced1\") " pod="openshift-marketplace/marketplace-operator-79b997595-crs88" Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.790191 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54e9da8e-917b-4a46-9fe9-725f950fced1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-crs88\" (UID: \"54e9da8e-917b-4a46-9fe9-725f950fced1\") " pod="openshift-marketplace/marketplace-operator-79b997595-crs88" Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.791782 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54e9da8e-917b-4a46-9fe9-725f950fced1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-crs88\" (UID: \"54e9da8e-917b-4a46-9fe9-725f950fced1\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-crs88" Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.804716 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/54e9da8e-917b-4a46-9fe9-725f950fced1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-crs88\" (UID: \"54e9da8e-917b-4a46-9fe9-725f950fced1\") " pod="openshift-marketplace/marketplace-operator-79b997595-crs88" Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.809256 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8cr9\" (UniqueName: \"kubernetes.io/projected/54e9da8e-917b-4a46-9fe9-725f950fced1-kube-api-access-f8cr9\") pod \"marketplace-operator-79b997595-crs88\" (UID: \"54e9da8e-917b-4a46-9fe9-725f950fced1\") " pod="openshift-marketplace/marketplace-operator-79b997595-crs88" Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.914926 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-crs88" Nov 25 15:40:27 crc kubenswrapper[4704]: E1125 15:40:27.933033 4704 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864 is running failed: container process not found" containerID="ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864" cmd=["grpc_health_probe","-addr=:50051"] Nov 25 15:40:27 crc kubenswrapper[4704]: E1125 15:40:27.934204 4704 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864 is running failed: container process not found" containerID="ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864" cmd=["grpc_health_probe","-addr=:50051"] Nov 25 15:40:27 crc kubenswrapper[4704]: E1125 15:40:27.934727 4704 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864 is running failed: container process not found" containerID="ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864" cmd=["grpc_health_probe","-addr=:50051"] Nov 25 15:40:27 crc kubenswrapper[4704]: E1125 15:40:27.934760 4704 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-mffhk" podUID="36f52ec1-c7de-4345-bc4a-4d8fc6f182fe" containerName="registry-server" Nov 25 15:40:27 crc kubenswrapper[4704]: I1125 15:40:27.982299 4704 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.024005 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.035171 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.043566 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.095665 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09422116-4570-4f3b-bde3-aaebdb318c47-catalog-content\") pod \"09422116-4570-4f3b-bde3-aaebdb318c47\" (UID: \"09422116-4570-4f3b-bde3-aaebdb318c47\") " Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.095707 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09422116-4570-4f3b-bde3-aaebdb318c47-utilities\") pod \"09422116-4570-4f3b-bde3-aaebdb318c47\" (UID: \"09422116-4570-4f3b-bde3-aaebdb318c47\") " Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.095737 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgrvk\" (UniqueName: \"kubernetes.io/projected/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-kube-api-access-jgrvk\") pod \"cdb3b0c5-af6c-4d36-bba3-5a8419a72107\" (UID: \"cdb3b0c5-af6c-4d36-bba3-5a8419a72107\") " Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.095801 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/6193bcc6-1da4-414c-84df-92b1bead0762-marketplace-operator-metrics\") pod \"6193bcc6-1da4-414c-84df-92b1bead0762\" (UID: \"6193bcc6-1da4-414c-84df-92b1bead0762\") " Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.095844 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-catalog-content\") pod \"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe\" (UID: \"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe\") " Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.095890 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfmss\" (UniqueName: \"kubernetes.io/projected/6193bcc6-1da4-414c-84df-92b1bead0762-kube-api-access-hfmss\") pod \"6193bcc6-1da4-414c-84df-92b1bead0762\" (UID: \"6193bcc6-1da4-414c-84df-92b1bead0762\") " Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.095915 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6193bcc6-1da4-414c-84df-92b1bead0762-marketplace-trusted-ca\") pod \"6193bcc6-1da4-414c-84df-92b1bead0762\" (UID: \"6193bcc6-1da4-414c-84df-92b1bead0762\") " Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.095937 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-utilities\") pod \"cdb3b0c5-af6c-4d36-bba3-5a8419a72107\" (UID: \"cdb3b0c5-af6c-4d36-bba3-5a8419a72107\") " Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.095958 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-utilities\") pod \"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe\" (UID: \"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe\") " Nov 25 15:40:28 crc 
kubenswrapper[4704]: I1125 15:40:28.095973 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-catalog-content\") pod \"cdb3b0c5-af6c-4d36-bba3-5a8419a72107\" (UID: \"cdb3b0c5-af6c-4d36-bba3-5a8419a72107\") " Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.096003 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h6r6\" (UniqueName: \"kubernetes.io/projected/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-kube-api-access-5h6r6\") pod \"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe\" (UID: \"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe\") " Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.096039 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvc5r\" (UniqueName: \"kubernetes.io/projected/09422116-4570-4f3b-bde3-aaebdb318c47-kube-api-access-jvc5r\") pod \"09422116-4570-4f3b-bde3-aaebdb318c47\" (UID: \"09422116-4570-4f3b-bde3-aaebdb318c47\") " Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.097606 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6193bcc6-1da4-414c-84df-92b1bead0762-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6193bcc6-1da4-414c-84df-92b1bead0762" (UID: "6193bcc6-1da4-414c-84df-92b1bead0762"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.097625 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09422116-4570-4f3b-bde3-aaebdb318c47-utilities" (OuterVolumeSpecName: "utilities") pod "09422116-4570-4f3b-bde3-aaebdb318c47" (UID: "09422116-4570-4f3b-bde3-aaebdb318c47"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.098029 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-utilities" (OuterVolumeSpecName: "utilities") pod "cdb3b0c5-af6c-4d36-bba3-5a8419a72107" (UID: "cdb3b0c5-af6c-4d36-bba3-5a8419a72107"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.108105 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09422116-4570-4f3b-bde3-aaebdb318c47-kube-api-access-jvc5r" (OuterVolumeSpecName: "kube-api-access-jvc5r") pod "09422116-4570-4f3b-bde3-aaebdb318c47" (UID: "09422116-4570-4f3b-bde3-aaebdb318c47"). InnerVolumeSpecName "kube-api-access-jvc5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.108254 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-kube-api-access-5h6r6" (OuterVolumeSpecName: "kube-api-access-5h6r6") pod "36f52ec1-c7de-4345-bc4a-4d8fc6f182fe" (UID: "36f52ec1-c7de-4345-bc4a-4d8fc6f182fe"). InnerVolumeSpecName "kube-api-access-5h6r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.108316 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6193bcc6-1da4-414c-84df-92b1bead0762-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6193bcc6-1da4-414c-84df-92b1bead0762" (UID: "6193bcc6-1da4-414c-84df-92b1bead0762"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.108418 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6193bcc6-1da4-414c-84df-92b1bead0762-kube-api-access-hfmss" (OuterVolumeSpecName: "kube-api-access-hfmss") pod "6193bcc6-1da4-414c-84df-92b1bead0762" (UID: "6193bcc6-1da4-414c-84df-92b1bead0762"). InnerVolumeSpecName "kube-api-access-hfmss". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.112175 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-kube-api-access-jgrvk" (OuterVolumeSpecName: "kube-api-access-jgrvk") pod "cdb3b0c5-af6c-4d36-bba3-5a8419a72107" (UID: "cdb3b0c5-af6c-4d36-bba3-5a8419a72107"). InnerVolumeSpecName "kube-api-access-jgrvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.121505 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-utilities" (OuterVolumeSpecName: "utilities") pod "36f52ec1-c7de-4345-bc4a-4d8fc6f182fe" (UID: "36f52ec1-c7de-4345-bc4a-4d8fc6f182fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.124758 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36f52ec1-c7de-4345-bc4a-4d8fc6f182fe" (UID: "36f52ec1-c7de-4345-bc4a-4d8fc6f182fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.166363 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09422116-4570-4f3b-bde3-aaebdb318c47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09422116-4570-4f3b-bde3-aaebdb318c47" (UID: "09422116-4570-4f3b-bde3-aaebdb318c47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.197980 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.198033 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h6r6\" (UniqueName: \"kubernetes.io/projected/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-kube-api-access-5h6r6\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.198047 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvc5r\" (UniqueName: \"kubernetes.io/projected/09422116-4570-4f3b-bde3-aaebdb318c47-kube-api-access-jvc5r\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.198058 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09422116-4570-4f3b-bde3-aaebdb318c47-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.198073 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09422116-4570-4f3b-bde3-aaebdb318c47-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.198084 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgrvk\" 
(UniqueName: \"kubernetes.io/projected/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-kube-api-access-jgrvk\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.198096 4704 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6193bcc6-1da4-414c-84df-92b1bead0762-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.198109 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.198120 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfmss\" (UniqueName: \"kubernetes.io/projected/6193bcc6-1da4-414c-84df-92b1bead0762-kube-api-access-hfmss\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.198171 4704 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6193bcc6-1da4-414c-84df-92b1bead0762-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.198179 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.202304 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cdb3b0c5-af6c-4d36-bba3-5a8419a72107" (UID: "cdb3b0c5-af6c-4d36-bba3-5a8419a72107"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.220327 4704 generic.go:334] "Generic (PLEG): container finished" podID="09422116-4570-4f3b-bde3-aaebdb318c47" containerID="a29ffc387d11e823ecaa7ff86ac49234cae3877c97949819c9b7fb1e9fb03c08" exitCode=0 Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.220367 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smlnk" event={"ID":"09422116-4570-4f3b-bde3-aaebdb318c47","Type":"ContainerDied","Data":"a29ffc387d11e823ecaa7ff86ac49234cae3877c97949819c9b7fb1e9fb03c08"} Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.220396 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-smlnk" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.220409 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smlnk" event={"ID":"09422116-4570-4f3b-bde3-aaebdb318c47","Type":"ContainerDied","Data":"66ba46f0bc70edc5a7c1f05cbc3a488ab73892eddf91b9da1c221bb5221bbc6a"} Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.220434 4704 scope.go:117] "RemoveContainer" containerID="a29ffc387d11e823ecaa7ff86ac49234cae3877c97949819c9b7fb1e9fb03c08" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.223196 4704 generic.go:334] "Generic (PLEG): container finished" podID="36f52ec1-c7de-4345-bc4a-4d8fc6f182fe" containerID="ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864" exitCode=0 Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.223256 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mffhk" event={"ID":"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe","Type":"ContainerDied","Data":"ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864"} Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.223284 4704 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mffhk" event={"ID":"36f52ec1-c7de-4345-bc4a-4d8fc6f182fe","Type":"ContainerDied","Data":"ceb9cd6a14d8e2df6c05f3a6ec775f76f19f65bcef90319f556dc7ee8638e9e9"} Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.223260 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mffhk" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.225470 4704 generic.go:334] "Generic (PLEG): container finished" podID="260ec6a9-8914-49dc-8cd8-95c8fa30a29a" containerID="c0c1b8a4b3ec9058a4f9588721d8dd59ee55306bd1432a95d00bc52e270a5e2b" exitCode=0 Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.225526 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mx4sl" event={"ID":"260ec6a9-8914-49dc-8cd8-95c8fa30a29a","Type":"ContainerDied","Data":"c0c1b8a4b3ec9058a4f9588721d8dd59ee55306bd1432a95d00bc52e270a5e2b"} Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.227359 4704 generic.go:334] "Generic (PLEG): container finished" podID="6193bcc6-1da4-414c-84df-92b1bead0762" containerID="1a6d289ac1aa1275bf04bb68f32c952870c145ca1dff24f829e7be3e291711e4" exitCode=0 Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.227420 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" event={"ID":"6193bcc6-1da4-414c-84df-92b1bead0762","Type":"ContainerDied","Data":"1a6d289ac1aa1275bf04bb68f32c952870c145ca1dff24f829e7be3e291711e4"} Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.227445 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" event={"ID":"6193bcc6-1da4-414c-84df-92b1bead0762","Type":"ContainerDied","Data":"cb3481ba0ac7a4226998041b61a6cd551caf3fe5d08ed44535a683ad1d2caf4f"} Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.227498 4704 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8hvj9" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.238201 4704 generic.go:334] "Generic (PLEG): container finished" podID="cdb3b0c5-af6c-4d36-bba3-5a8419a72107" containerID="778b173d28098e0b96b3b2c5e7567794f5752393c66df47e0144e1c225f5f314" exitCode=0 Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.238303 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-94qft" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.238321 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94qft" event={"ID":"cdb3b0c5-af6c-4d36-bba3-5a8419a72107","Type":"ContainerDied","Data":"778b173d28098e0b96b3b2c5e7567794f5752393c66df47e0144e1c225f5f314"} Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.238917 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-94qft" event={"ID":"cdb3b0c5-af6c-4d36-bba3-5a8419a72107","Type":"ContainerDied","Data":"25c53006bdf190422531f0dcc52b3e6310ea1e6788d085287fba4ab8708750a0"} Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.252942 4704 scope.go:117] "RemoveContainer" containerID="0f21fda7cd26867522c00030115638a01c7af6e7788125674036939b5bd0318b" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.268376 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mffhk"] Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.272958 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mffhk"] Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.280658 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8hvj9"] Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.287869 
4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8hvj9"] Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.294232 4704 scope.go:117] "RemoveContainer" containerID="4c3a582ab9bc65b171524f7888e926516ff7564dc738bc8dc33bf67258c8231d" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.299892 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdb3b0c5-af6c-4d36-bba3-5a8419a72107-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.306041 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-smlnk"] Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.307670 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-smlnk"] Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.317541 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-94qft"] Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.319903 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-94qft"] Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.320209 4704 scope.go:117] "RemoveContainer" containerID="a29ffc387d11e823ecaa7ff86ac49234cae3877c97949819c9b7fb1e9fb03c08" Nov 25 15:40:28 crc kubenswrapper[4704]: E1125 15:40:28.320643 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a29ffc387d11e823ecaa7ff86ac49234cae3877c97949819c9b7fb1e9fb03c08\": container with ID starting with a29ffc387d11e823ecaa7ff86ac49234cae3877c97949819c9b7fb1e9fb03c08 not found: ID does not exist" containerID="a29ffc387d11e823ecaa7ff86ac49234cae3877c97949819c9b7fb1e9fb03c08" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.320672 4704 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29ffc387d11e823ecaa7ff86ac49234cae3877c97949819c9b7fb1e9fb03c08"} err="failed to get container status \"a29ffc387d11e823ecaa7ff86ac49234cae3877c97949819c9b7fb1e9fb03c08\": rpc error: code = NotFound desc = could not find container \"a29ffc387d11e823ecaa7ff86ac49234cae3877c97949819c9b7fb1e9fb03c08\": container with ID starting with a29ffc387d11e823ecaa7ff86ac49234cae3877c97949819c9b7fb1e9fb03c08 not found: ID does not exist" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.320695 4704 scope.go:117] "RemoveContainer" containerID="0f21fda7cd26867522c00030115638a01c7af6e7788125674036939b5bd0318b" Nov 25 15:40:28 crc kubenswrapper[4704]: E1125 15:40:28.321035 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f21fda7cd26867522c00030115638a01c7af6e7788125674036939b5bd0318b\": container with ID starting with 0f21fda7cd26867522c00030115638a01c7af6e7788125674036939b5bd0318b not found: ID does not exist" containerID="0f21fda7cd26867522c00030115638a01c7af6e7788125674036939b5bd0318b" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.321156 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f21fda7cd26867522c00030115638a01c7af6e7788125674036939b5bd0318b"} err="failed to get container status \"0f21fda7cd26867522c00030115638a01c7af6e7788125674036939b5bd0318b\": rpc error: code = NotFound desc = could not find container \"0f21fda7cd26867522c00030115638a01c7af6e7788125674036939b5bd0318b\": container with ID starting with 0f21fda7cd26867522c00030115638a01c7af6e7788125674036939b5bd0318b not found: ID does not exist" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.321192 4704 scope.go:117] "RemoveContainer" containerID="4c3a582ab9bc65b171524f7888e926516ff7564dc738bc8dc33bf67258c8231d" Nov 25 15:40:28 crc kubenswrapper[4704]: E1125 15:40:28.321565 4704 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c3a582ab9bc65b171524f7888e926516ff7564dc738bc8dc33bf67258c8231d\": container with ID starting with 4c3a582ab9bc65b171524f7888e926516ff7564dc738bc8dc33bf67258c8231d not found: ID does not exist" containerID="4c3a582ab9bc65b171524f7888e926516ff7564dc738bc8dc33bf67258c8231d" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.321594 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c3a582ab9bc65b171524f7888e926516ff7564dc738bc8dc33bf67258c8231d"} err="failed to get container status \"4c3a582ab9bc65b171524f7888e926516ff7564dc738bc8dc33bf67258c8231d\": rpc error: code = NotFound desc = could not find container \"4c3a582ab9bc65b171524f7888e926516ff7564dc738bc8dc33bf67258c8231d\": container with ID starting with 4c3a582ab9bc65b171524f7888e926516ff7564dc738bc8dc33bf67258c8231d not found: ID does not exist" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.321608 4704 scope.go:117] "RemoveContainer" containerID="ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.334136 4704 scope.go:117] "RemoveContainer" containerID="31f1678ff3eb9f0d68fbd3dd2ebc80d4aca53e6bb0a84252a68d482228d77fca" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.350524 4704 scope.go:117] "RemoveContainer" containerID="adb952b1edb22107b9d420a4dc1bd85e8c68d180c10c50a6deed2ee7490a95d7" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.361955 4704 scope.go:117] "RemoveContainer" containerID="ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864" Nov 25 15:40:28 crc kubenswrapper[4704]: E1125 15:40:28.362285 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864\": container with ID starting with 
ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864 not found: ID does not exist" containerID="ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.362315 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864"} err="failed to get container status \"ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864\": rpc error: code = NotFound desc = could not find container \"ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864\": container with ID starting with ab07af679b445dbd637a942918390ee20861a7ec64c1124f07af2dc98b273864 not found: ID does not exist" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.362342 4704 scope.go:117] "RemoveContainer" containerID="31f1678ff3eb9f0d68fbd3dd2ebc80d4aca53e6bb0a84252a68d482228d77fca" Nov 25 15:40:28 crc kubenswrapper[4704]: E1125 15:40:28.362581 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31f1678ff3eb9f0d68fbd3dd2ebc80d4aca53e6bb0a84252a68d482228d77fca\": container with ID starting with 31f1678ff3eb9f0d68fbd3dd2ebc80d4aca53e6bb0a84252a68d482228d77fca not found: ID does not exist" containerID="31f1678ff3eb9f0d68fbd3dd2ebc80d4aca53e6bb0a84252a68d482228d77fca" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.362602 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f1678ff3eb9f0d68fbd3dd2ebc80d4aca53e6bb0a84252a68d482228d77fca"} err="failed to get container status \"31f1678ff3eb9f0d68fbd3dd2ebc80d4aca53e6bb0a84252a68d482228d77fca\": rpc error: code = NotFound desc = could not find container \"31f1678ff3eb9f0d68fbd3dd2ebc80d4aca53e6bb0a84252a68d482228d77fca\": container with ID starting with 31f1678ff3eb9f0d68fbd3dd2ebc80d4aca53e6bb0a84252a68d482228d77fca not found: ID does not 
exist" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.362617 4704 scope.go:117] "RemoveContainer" containerID="adb952b1edb22107b9d420a4dc1bd85e8c68d180c10c50a6deed2ee7490a95d7" Nov 25 15:40:28 crc kubenswrapper[4704]: E1125 15:40:28.362802 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adb952b1edb22107b9d420a4dc1bd85e8c68d180c10c50a6deed2ee7490a95d7\": container with ID starting with adb952b1edb22107b9d420a4dc1bd85e8c68d180c10c50a6deed2ee7490a95d7 not found: ID does not exist" containerID="adb952b1edb22107b9d420a4dc1bd85e8c68d180c10c50a6deed2ee7490a95d7" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.362820 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adb952b1edb22107b9d420a4dc1bd85e8c68d180c10c50a6deed2ee7490a95d7"} err="failed to get container status \"adb952b1edb22107b9d420a4dc1bd85e8c68d180c10c50a6deed2ee7490a95d7\": rpc error: code = NotFound desc = could not find container \"adb952b1edb22107b9d420a4dc1bd85e8c68d180c10c50a6deed2ee7490a95d7\": container with ID starting with adb952b1edb22107b9d420a4dc1bd85e8c68d180c10c50a6deed2ee7490a95d7 not found: ID does not exist" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.362832 4704 scope.go:117] "RemoveContainer" containerID="1a6d289ac1aa1275bf04bb68f32c952870c145ca1dff24f829e7be3e291711e4" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.374972 4704 scope.go:117] "RemoveContainer" containerID="1a6d289ac1aa1275bf04bb68f32c952870c145ca1dff24f829e7be3e291711e4" Nov 25 15:40:28 crc kubenswrapper[4704]: E1125 15:40:28.375458 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a6d289ac1aa1275bf04bb68f32c952870c145ca1dff24f829e7be3e291711e4\": container with ID starting with 1a6d289ac1aa1275bf04bb68f32c952870c145ca1dff24f829e7be3e291711e4 not found: ID does not exist" 
containerID="1a6d289ac1aa1275bf04bb68f32c952870c145ca1dff24f829e7be3e291711e4" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.375530 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6d289ac1aa1275bf04bb68f32c952870c145ca1dff24f829e7be3e291711e4"} err="failed to get container status \"1a6d289ac1aa1275bf04bb68f32c952870c145ca1dff24f829e7be3e291711e4\": rpc error: code = NotFound desc = could not find container \"1a6d289ac1aa1275bf04bb68f32c952870c145ca1dff24f829e7be3e291711e4\": container with ID starting with 1a6d289ac1aa1275bf04bb68f32c952870c145ca1dff24f829e7be3e291711e4 not found: ID does not exist" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.375556 4704 scope.go:117] "RemoveContainer" containerID="778b173d28098e0b96b3b2c5e7567794f5752393c66df47e0144e1c225f5f314" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.399645 4704 scope.go:117] "RemoveContainer" containerID="5ea26e846b666fdbd0d1da572245ad3e4b5709758d8a44776cd67eb606d3c8a8" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.401219 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-crs88"] Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.415707 4704 scope.go:117] "RemoveContainer" containerID="1602d8c38f0b068631e8b40e66c5d572f5ad77fd1fa9014d89f26422eac0bec1" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.426048 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09422116-4570-4f3b-bde3-aaebdb318c47" path="/var/lib/kubelet/pods/09422116-4570-4f3b-bde3-aaebdb318c47/volumes" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.432924 4704 scope.go:117] "RemoveContainer" containerID="778b173d28098e0b96b3b2c5e7567794f5752393c66df47e0144e1c225f5f314" Nov 25 15:40:28 crc kubenswrapper[4704]: E1125 15:40:28.433362 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"778b173d28098e0b96b3b2c5e7567794f5752393c66df47e0144e1c225f5f314\": container with ID starting with 778b173d28098e0b96b3b2c5e7567794f5752393c66df47e0144e1c225f5f314 not found: ID does not exist" containerID="778b173d28098e0b96b3b2c5e7567794f5752393c66df47e0144e1c225f5f314" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.433678 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778b173d28098e0b96b3b2c5e7567794f5752393c66df47e0144e1c225f5f314"} err="failed to get container status \"778b173d28098e0b96b3b2c5e7567794f5752393c66df47e0144e1c225f5f314\": rpc error: code = NotFound desc = could not find container \"778b173d28098e0b96b3b2c5e7567794f5752393c66df47e0144e1c225f5f314\": container with ID starting with 778b173d28098e0b96b3b2c5e7567794f5752393c66df47e0144e1c225f5f314 not found: ID does not exist" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.433718 4704 scope.go:117] "RemoveContainer" containerID="5ea26e846b666fdbd0d1da572245ad3e4b5709758d8a44776cd67eb606d3c8a8" Nov 25 15:40:28 crc kubenswrapper[4704]: E1125 15:40:28.434047 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ea26e846b666fdbd0d1da572245ad3e4b5709758d8a44776cd67eb606d3c8a8\": container with ID starting with 5ea26e846b666fdbd0d1da572245ad3e4b5709758d8a44776cd67eb606d3c8a8 not found: ID does not exist" containerID="5ea26e846b666fdbd0d1da572245ad3e4b5709758d8a44776cd67eb606d3c8a8" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.434076 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ea26e846b666fdbd0d1da572245ad3e4b5709758d8a44776cd67eb606d3c8a8"} err="failed to get container status \"5ea26e846b666fdbd0d1da572245ad3e4b5709758d8a44776cd67eb606d3c8a8\": rpc error: code = NotFound desc = could not find container \"5ea26e846b666fdbd0d1da572245ad3e4b5709758d8a44776cd67eb606d3c8a8\": 
container with ID starting with 5ea26e846b666fdbd0d1da572245ad3e4b5709758d8a44776cd67eb606d3c8a8 not found: ID does not exist" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.434093 4704 scope.go:117] "RemoveContainer" containerID="1602d8c38f0b068631e8b40e66c5d572f5ad77fd1fa9014d89f26422eac0bec1" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.434153 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36f52ec1-c7de-4345-bc4a-4d8fc6f182fe" path="/var/lib/kubelet/pods/36f52ec1-c7de-4345-bc4a-4d8fc6f182fe/volumes" Nov 25 15:40:28 crc kubenswrapper[4704]: E1125 15:40:28.434345 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1602d8c38f0b068631e8b40e66c5d572f5ad77fd1fa9014d89f26422eac0bec1\": container with ID starting with 1602d8c38f0b068631e8b40e66c5d572f5ad77fd1fa9014d89f26422eac0bec1 not found: ID does not exist" containerID="1602d8c38f0b068631e8b40e66c5d572f5ad77fd1fa9014d89f26422eac0bec1" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.434372 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1602d8c38f0b068631e8b40e66c5d572f5ad77fd1fa9014d89f26422eac0bec1"} err="failed to get container status \"1602d8c38f0b068631e8b40e66c5d572f5ad77fd1fa9014d89f26422eac0bec1\": rpc error: code = NotFound desc = could not find container \"1602d8c38f0b068631e8b40e66c5d572f5ad77fd1fa9014d89f26422eac0bec1\": container with ID starting with 1602d8c38f0b068631e8b40e66c5d572f5ad77fd1fa9014d89f26422eac0bec1 not found: ID does not exist" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.439415 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6193bcc6-1da4-414c-84df-92b1bead0762" path="/var/lib/kubelet/pods/6193bcc6-1da4-414c-84df-92b1bead0762/volumes" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.442912 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cdb3b0c5-af6c-4d36-bba3-5a8419a72107" path="/var/lib/kubelet/pods/cdb3b0c5-af6c-4d36-bba3-5a8419a72107/volumes" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.645292 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.717190 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmfxk\" (UniqueName: \"kubernetes.io/projected/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-kube-api-access-wmfxk\") pod \"260ec6a9-8914-49dc-8cd8-95c8fa30a29a\" (UID: \"260ec6a9-8914-49dc-8cd8-95c8fa30a29a\") " Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.717300 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-utilities\") pod \"260ec6a9-8914-49dc-8cd8-95c8fa30a29a\" (UID: \"260ec6a9-8914-49dc-8cd8-95c8fa30a29a\") " Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.717450 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-catalog-content\") pod \"260ec6a9-8914-49dc-8cd8-95c8fa30a29a\" (UID: \"260ec6a9-8914-49dc-8cd8-95c8fa30a29a\") " Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.718864 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-utilities" (OuterVolumeSpecName: "utilities") pod "260ec6a9-8914-49dc-8cd8-95c8fa30a29a" (UID: "260ec6a9-8914-49dc-8cd8-95c8fa30a29a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.732046 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-kube-api-access-wmfxk" (OuterVolumeSpecName: "kube-api-access-wmfxk") pod "260ec6a9-8914-49dc-8cd8-95c8fa30a29a" (UID: "260ec6a9-8914-49dc-8cd8-95c8fa30a29a"). InnerVolumeSpecName "kube-api-access-wmfxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.776052 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "260ec6a9-8914-49dc-8cd8-95c8fa30a29a" (UID: "260ec6a9-8914-49dc-8cd8-95c8fa30a29a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.819348 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.819398 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmfxk\" (UniqueName: \"kubernetes.io/projected/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-kube-api-access-wmfxk\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:28 crc kubenswrapper[4704]: I1125 15:40:28.819412 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/260ec6a9-8914-49dc-8cd8-95c8fa30a29a-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.246139 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-crs88" 
event={"ID":"54e9da8e-917b-4a46-9fe9-725f950fced1","Type":"ContainerStarted","Data":"7ff695db8104e4f336650bd938a97a4f6af48ad11f956a1509947fd51c5dc59a"} Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.246687 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-crs88" event={"ID":"54e9da8e-917b-4a46-9fe9-725f950fced1","Type":"ContainerStarted","Data":"01f46bc2675eb32a6d4ac85853d5ee2753e84192f1e82155bba6ff7d9f093323"} Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.246756 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-crs88" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.252595 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mx4sl" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.255338 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mx4sl" event={"ID":"260ec6a9-8914-49dc-8cd8-95c8fa30a29a","Type":"ContainerDied","Data":"7dd3c26fef1b6ab2f864fa116cb35c419bb3632be889a464cc8eda23909f673c"} Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.255566 4704 scope.go:117] "RemoveContainer" containerID="c0c1b8a4b3ec9058a4f9588721d8dd59ee55306bd1432a95d00bc52e270a5e2b" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.259768 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-crs88" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.277213 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-crs88" podStartSLOduration=2.277183482 podStartE2EDuration="2.277183482s" podCreationTimestamp="2025-11-25 15:40:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-11-25 15:40:29.274867004 +0000 UTC m=+315.543140805" watchObservedRunningTime="2025-11-25 15:40:29.277183482 +0000 UTC m=+315.545457263" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.297355 4704 scope.go:117] "RemoveContainer" containerID="7e3cdf7f960d79183ec1e8e72702ba7dc4d18419e8edfa59a5a5842172fcec54" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.332978 4704 scope.go:117] "RemoveContainer" containerID="70beef7ddbd2aea21db1fe5fbf4694740e9a17afb3a13de4efb169ed39636fa7" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.339639 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mx4sl"] Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.346973 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mx4sl"] Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.747835 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zqxfm"] Nov 25 15:40:29 crc kubenswrapper[4704]: E1125 15:40:29.748145 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36f52ec1-c7de-4345-bc4a-4d8fc6f182fe" containerName="extract-utilities" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.748162 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="36f52ec1-c7de-4345-bc4a-4d8fc6f182fe" containerName="extract-utilities" Nov 25 15:40:29 crc kubenswrapper[4704]: E1125 15:40:29.748188 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09422116-4570-4f3b-bde3-aaebdb318c47" containerName="registry-server" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.748196 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="09422116-4570-4f3b-bde3-aaebdb318c47" containerName="registry-server" Nov 25 15:40:29 crc kubenswrapper[4704]: E1125 15:40:29.748215 4704 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cdb3b0c5-af6c-4d36-bba3-5a8419a72107" containerName="extract-content" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.748227 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb3b0c5-af6c-4d36-bba3-5a8419a72107" containerName="extract-content" Nov 25 15:40:29 crc kubenswrapper[4704]: E1125 15:40:29.748245 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="260ec6a9-8914-49dc-8cd8-95c8fa30a29a" containerName="registry-server" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.748282 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="260ec6a9-8914-49dc-8cd8-95c8fa30a29a" containerName="registry-server" Nov 25 15:40:29 crc kubenswrapper[4704]: E1125 15:40:29.748298 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6193bcc6-1da4-414c-84df-92b1bead0762" containerName="marketplace-operator" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.748310 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="6193bcc6-1da4-414c-84df-92b1bead0762" containerName="marketplace-operator" Nov 25 15:40:29 crc kubenswrapper[4704]: E1125 15:40:29.748320 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb3b0c5-af6c-4d36-bba3-5a8419a72107" containerName="extract-utilities" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.748327 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb3b0c5-af6c-4d36-bba3-5a8419a72107" containerName="extract-utilities" Nov 25 15:40:29 crc kubenswrapper[4704]: E1125 15:40:29.748365 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09422116-4570-4f3b-bde3-aaebdb318c47" containerName="extract-utilities" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.748374 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="09422116-4570-4f3b-bde3-aaebdb318c47" containerName="extract-utilities" Nov 25 15:40:29 crc kubenswrapper[4704]: E1125 15:40:29.748386 4704 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="cdb3b0c5-af6c-4d36-bba3-5a8419a72107" containerName="registry-server" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.748392 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb3b0c5-af6c-4d36-bba3-5a8419a72107" containerName="registry-server" Nov 25 15:40:29 crc kubenswrapper[4704]: E1125 15:40:29.748401 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="260ec6a9-8914-49dc-8cd8-95c8fa30a29a" containerName="extract-content" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.748407 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="260ec6a9-8914-49dc-8cd8-95c8fa30a29a" containerName="extract-content" Nov 25 15:40:29 crc kubenswrapper[4704]: E1125 15:40:29.748413 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09422116-4570-4f3b-bde3-aaebdb318c47" containerName="extract-content" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.748441 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="09422116-4570-4f3b-bde3-aaebdb318c47" containerName="extract-content" Nov 25 15:40:29 crc kubenswrapper[4704]: E1125 15:40:29.748452 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36f52ec1-c7de-4345-bc4a-4d8fc6f182fe" containerName="registry-server" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.748458 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="36f52ec1-c7de-4345-bc4a-4d8fc6f182fe" containerName="registry-server" Nov 25 15:40:29 crc kubenswrapper[4704]: E1125 15:40:29.748469 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36f52ec1-c7de-4345-bc4a-4d8fc6f182fe" containerName="extract-content" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.748476 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="36f52ec1-c7de-4345-bc4a-4d8fc6f182fe" containerName="extract-content" Nov 25 15:40:29 crc kubenswrapper[4704]: E1125 15:40:29.748484 4704 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="260ec6a9-8914-49dc-8cd8-95c8fa30a29a" containerName="extract-utilities" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.748491 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="260ec6a9-8914-49dc-8cd8-95c8fa30a29a" containerName="extract-utilities" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.749633 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="6193bcc6-1da4-414c-84df-92b1bead0762" containerName="marketplace-operator" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.749648 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="260ec6a9-8914-49dc-8cd8-95c8fa30a29a" containerName="registry-server" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.749658 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="09422116-4570-4f3b-bde3-aaebdb318c47" containerName="registry-server" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.749668 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="36f52ec1-c7de-4345-bc4a-4d8fc6f182fe" containerName="registry-server" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.749700 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb3b0c5-af6c-4d36-bba3-5a8419a72107" containerName="registry-server" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.751221 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zqxfm" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.753087 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.757822 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zqxfm"] Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.831668 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4cw7\" (UniqueName: \"kubernetes.io/projected/c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13-kube-api-access-f4cw7\") pod \"redhat-marketplace-zqxfm\" (UID: \"c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13\") " pod="openshift-marketplace/redhat-marketplace-zqxfm" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.831749 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13-catalog-content\") pod \"redhat-marketplace-zqxfm\" (UID: \"c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13\") " pod="openshift-marketplace/redhat-marketplace-zqxfm" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.831810 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13-utilities\") pod \"redhat-marketplace-zqxfm\" (UID: \"c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13\") " pod="openshift-marketplace/redhat-marketplace-zqxfm" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.933481 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4cw7\" (UniqueName: \"kubernetes.io/projected/c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13-kube-api-access-f4cw7\") pod \"redhat-marketplace-zqxfm\" (UID: 
\"c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13\") " pod="openshift-marketplace/redhat-marketplace-zqxfm" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.933558 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13-catalog-content\") pod \"redhat-marketplace-zqxfm\" (UID: \"c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13\") " pod="openshift-marketplace/redhat-marketplace-zqxfm" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.933813 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13-utilities\") pod \"redhat-marketplace-zqxfm\" (UID: \"c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13\") " pod="openshift-marketplace/redhat-marketplace-zqxfm" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.934384 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13-utilities\") pod \"redhat-marketplace-zqxfm\" (UID: \"c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13\") " pod="openshift-marketplace/redhat-marketplace-zqxfm" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.935086 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13-catalog-content\") pod \"redhat-marketplace-zqxfm\" (UID: \"c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13\") " pod="openshift-marketplace/redhat-marketplace-zqxfm" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.949500 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qm54c"] Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.950552 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qm54c" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.952811 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.959375 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4cw7\" (UniqueName: \"kubernetes.io/projected/c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13-kube-api-access-f4cw7\") pod \"redhat-marketplace-zqxfm\" (UID: \"c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13\") " pod="openshift-marketplace/redhat-marketplace-zqxfm" Nov 25 15:40:29 crc kubenswrapper[4704]: I1125 15:40:29.972158 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qm54c"] Nov 25 15:40:30 crc kubenswrapper[4704]: I1125 15:40:30.034761 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4912dfbf-fbd9-41d7-aba3-0a02558ab662-utilities\") pod \"certified-operators-qm54c\" (UID: \"4912dfbf-fbd9-41d7-aba3-0a02558ab662\") " pod="openshift-marketplace/certified-operators-qm54c" Nov 25 15:40:30 crc kubenswrapper[4704]: I1125 15:40:30.034891 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4912dfbf-fbd9-41d7-aba3-0a02558ab662-catalog-content\") pod \"certified-operators-qm54c\" (UID: \"4912dfbf-fbd9-41d7-aba3-0a02558ab662\") " pod="openshift-marketplace/certified-operators-qm54c" Nov 25 15:40:30 crc kubenswrapper[4704]: I1125 15:40:30.034938 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cvhx\" (UniqueName: \"kubernetes.io/projected/4912dfbf-fbd9-41d7-aba3-0a02558ab662-kube-api-access-4cvhx\") pod \"certified-operators-qm54c\" (UID: 
\"4912dfbf-fbd9-41d7-aba3-0a02558ab662\") " pod="openshift-marketplace/certified-operators-qm54c" Nov 25 15:40:30 crc kubenswrapper[4704]: I1125 15:40:30.078288 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zqxfm" Nov 25 15:40:30 crc kubenswrapper[4704]: I1125 15:40:30.136126 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4912dfbf-fbd9-41d7-aba3-0a02558ab662-catalog-content\") pod \"certified-operators-qm54c\" (UID: \"4912dfbf-fbd9-41d7-aba3-0a02558ab662\") " pod="openshift-marketplace/certified-operators-qm54c" Nov 25 15:40:30 crc kubenswrapper[4704]: I1125 15:40:30.136610 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cvhx\" (UniqueName: \"kubernetes.io/projected/4912dfbf-fbd9-41d7-aba3-0a02558ab662-kube-api-access-4cvhx\") pod \"certified-operators-qm54c\" (UID: \"4912dfbf-fbd9-41d7-aba3-0a02558ab662\") " pod="openshift-marketplace/certified-operators-qm54c" Nov 25 15:40:30 crc kubenswrapper[4704]: I1125 15:40:30.136649 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4912dfbf-fbd9-41d7-aba3-0a02558ab662-utilities\") pod \"certified-operators-qm54c\" (UID: \"4912dfbf-fbd9-41d7-aba3-0a02558ab662\") " pod="openshift-marketplace/certified-operators-qm54c" Nov 25 15:40:30 crc kubenswrapper[4704]: I1125 15:40:30.136727 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4912dfbf-fbd9-41d7-aba3-0a02558ab662-catalog-content\") pod \"certified-operators-qm54c\" (UID: \"4912dfbf-fbd9-41d7-aba3-0a02558ab662\") " pod="openshift-marketplace/certified-operators-qm54c" Nov 25 15:40:30 crc kubenswrapper[4704]: I1125 15:40:30.137129 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4912dfbf-fbd9-41d7-aba3-0a02558ab662-utilities\") pod \"certified-operators-qm54c\" (UID: \"4912dfbf-fbd9-41d7-aba3-0a02558ab662\") " pod="openshift-marketplace/certified-operators-qm54c" Nov 25 15:40:30 crc kubenswrapper[4704]: I1125 15:40:30.168325 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cvhx\" (UniqueName: \"kubernetes.io/projected/4912dfbf-fbd9-41d7-aba3-0a02558ab662-kube-api-access-4cvhx\") pod \"certified-operators-qm54c\" (UID: \"4912dfbf-fbd9-41d7-aba3-0a02558ab662\") " pod="openshift-marketplace/certified-operators-qm54c" Nov 25 15:40:30 crc kubenswrapper[4704]: I1125 15:40:30.287467 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qm54c" Nov 25 15:40:30 crc kubenswrapper[4704]: I1125 15:40:30.423275 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="260ec6a9-8914-49dc-8cd8-95c8fa30a29a" path="/var/lib/kubelet/pods/260ec6a9-8914-49dc-8cd8-95c8fa30a29a/volumes" Nov 25 15:40:30 crc kubenswrapper[4704]: I1125 15:40:30.474403 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zqxfm"] Nov 25 15:40:30 crc kubenswrapper[4704]: W1125 15:40:30.484940 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0dcd3b2_1c21_4b62_8dc1_9c5ef2ee7a13.slice/crio-41308ec2d5a930b32983e23d4ca5befbc814d35f3770d05089bcb930b98269c0 WatchSource:0}: Error finding container 41308ec2d5a930b32983e23d4ca5befbc814d35f3770d05089bcb930b98269c0: Status 404 returned error can't find the container with id 41308ec2d5a930b32983e23d4ca5befbc814d35f3770d05089bcb930b98269c0 Nov 25 15:40:30 crc kubenswrapper[4704]: I1125 15:40:30.673973 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qm54c"] Nov 25 15:40:30 crc 
kubenswrapper[4704]: W1125 15:40:30.683451 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4912dfbf_fbd9_41d7_aba3_0a02558ab662.slice/crio-307e92bde656f4f07ab8249bb01b2ec0fdc2b0443e7718336916600bff87321f WatchSource:0}: Error finding container 307e92bde656f4f07ab8249bb01b2ec0fdc2b0443e7718336916600bff87321f: Status 404 returned error can't find the container with id 307e92bde656f4f07ab8249bb01b2ec0fdc2b0443e7718336916600bff87321f Nov 25 15:40:31 crc kubenswrapper[4704]: I1125 15:40:31.269642 4704 generic.go:334] "Generic (PLEG): container finished" podID="4912dfbf-fbd9-41d7-aba3-0a02558ab662" containerID="0b87491e6fb12d213bf819bae500ee52499f9256448e75a367379f5060a23337" exitCode=0 Nov 25 15:40:31 crc kubenswrapper[4704]: I1125 15:40:31.269762 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm54c" event={"ID":"4912dfbf-fbd9-41d7-aba3-0a02558ab662","Type":"ContainerDied","Data":"0b87491e6fb12d213bf819bae500ee52499f9256448e75a367379f5060a23337"} Nov 25 15:40:31 crc kubenswrapper[4704]: I1125 15:40:31.269842 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm54c" event={"ID":"4912dfbf-fbd9-41d7-aba3-0a02558ab662","Type":"ContainerStarted","Data":"307e92bde656f4f07ab8249bb01b2ec0fdc2b0443e7718336916600bff87321f"} Nov 25 15:40:31 crc kubenswrapper[4704]: I1125 15:40:31.273655 4704 generic.go:334] "Generic (PLEG): container finished" podID="c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13" containerID="97164e50022778714d1ba356b07c599829bdd97c29f2b924264d88ecebed4415" exitCode=0 Nov 25 15:40:31 crc kubenswrapper[4704]: I1125 15:40:31.273731 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqxfm" event={"ID":"c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13","Type":"ContainerDied","Data":"97164e50022778714d1ba356b07c599829bdd97c29f2b924264d88ecebed4415"} Nov 
25 15:40:31 crc kubenswrapper[4704]: I1125 15:40:31.275005 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqxfm" event={"ID":"c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13","Type":"ContainerStarted","Data":"41308ec2d5a930b32983e23d4ca5befbc814d35f3770d05089bcb930b98269c0"} Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.148303 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s7lr8"] Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.149478 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7lr8" Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.153414 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.157525 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s7lr8"] Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.264633 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11b0bae-8584-41bd-9970-af1f50073c21-catalog-content\") pod \"community-operators-s7lr8\" (UID: \"f11b0bae-8584-41bd-9970-af1f50073c21\") " pod="openshift-marketplace/community-operators-s7lr8" Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.265073 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhr6h\" (UniqueName: \"kubernetes.io/projected/f11b0bae-8584-41bd-9970-af1f50073c21-kube-api-access-zhr6h\") pod \"community-operators-s7lr8\" (UID: \"f11b0bae-8584-41bd-9970-af1f50073c21\") " pod="openshift-marketplace/community-operators-s7lr8" Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.265230 4704 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11b0bae-8584-41bd-9970-af1f50073c21-utilities\") pod \"community-operators-s7lr8\" (UID: \"f11b0bae-8584-41bd-9970-af1f50073c21\") " pod="openshift-marketplace/community-operators-s7lr8" Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.281058 4704 generic.go:334] "Generic (PLEG): container finished" podID="4912dfbf-fbd9-41d7-aba3-0a02558ab662" containerID="687661eca4e25e51cb59f3e966b190320474e516b1ac7aea2926fbb49e7a2d08" exitCode=0 Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.281159 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm54c" event={"ID":"4912dfbf-fbd9-41d7-aba3-0a02558ab662","Type":"ContainerDied","Data":"687661eca4e25e51cb59f3e966b190320474e516b1ac7aea2926fbb49e7a2d08"} Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.283360 4704 generic.go:334] "Generic (PLEG): container finished" podID="c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13" containerID="8f6da216e5520da4cb7b76d1852ac11ca03cf02b8c6a4286f30e029c22fef502" exitCode=0 Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.283404 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqxfm" event={"ID":"c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13","Type":"ContainerDied","Data":"8f6da216e5520da4cb7b76d1852ac11ca03cf02b8c6a4286f30e029c22fef502"} Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.352884 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-827sg"] Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.355964 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-827sg" Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.358401 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.360999 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-827sg"] Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.367118 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11b0bae-8584-41bd-9970-af1f50073c21-catalog-content\") pod \"community-operators-s7lr8\" (UID: \"f11b0bae-8584-41bd-9970-af1f50073c21\") " pod="openshift-marketplace/community-operators-s7lr8" Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.367634 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhr6h\" (UniqueName: \"kubernetes.io/projected/f11b0bae-8584-41bd-9970-af1f50073c21-kube-api-access-zhr6h\") pod \"community-operators-s7lr8\" (UID: \"f11b0bae-8584-41bd-9970-af1f50073c21\") " pod="openshift-marketplace/community-operators-s7lr8" Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.367727 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11b0bae-8584-41bd-9970-af1f50073c21-catalog-content\") pod \"community-operators-s7lr8\" (UID: \"f11b0bae-8584-41bd-9970-af1f50073c21\") " pod="openshift-marketplace/community-operators-s7lr8" Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.367889 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11b0bae-8584-41bd-9970-af1f50073c21-utilities\") pod \"community-operators-s7lr8\" (UID: \"f11b0bae-8584-41bd-9970-af1f50073c21\") " 
pod="openshift-marketplace/community-operators-s7lr8"
Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.368108 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11b0bae-8584-41bd-9970-af1f50073c21-utilities\") pod \"community-operators-s7lr8\" (UID: \"f11b0bae-8584-41bd-9970-af1f50073c21\") " pod="openshift-marketplace/community-operators-s7lr8"
Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.390807 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhr6h\" (UniqueName: \"kubernetes.io/projected/f11b0bae-8584-41bd-9970-af1f50073c21-kube-api-access-zhr6h\") pod \"community-operators-s7lr8\" (UID: \"f11b0bae-8584-41bd-9970-af1f50073c21\") " pod="openshift-marketplace/community-operators-s7lr8"
Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.469474 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81c4aee6-0cc5-4f75-98b9-d546819ce1df-catalog-content\") pod \"redhat-operators-827sg\" (UID: \"81c4aee6-0cc5-4f75-98b9-d546819ce1df\") " pod="openshift-marketplace/redhat-operators-827sg"
Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.469584 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81c4aee6-0cc5-4f75-98b9-d546819ce1df-utilities\") pod \"redhat-operators-827sg\" (UID: \"81c4aee6-0cc5-4f75-98b9-d546819ce1df\") " pod="openshift-marketplace/redhat-operators-827sg"
Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.469622 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n58cz\" (UniqueName: \"kubernetes.io/projected/81c4aee6-0cc5-4f75-98b9-d546819ce1df-kube-api-access-n58cz\") pod \"redhat-operators-827sg\" (UID: \"81c4aee6-0cc5-4f75-98b9-d546819ce1df\") " pod="openshift-marketplace/redhat-operators-827sg"
Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.483347 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s7lr8"
Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.573454 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81c4aee6-0cc5-4f75-98b9-d546819ce1df-catalog-content\") pod \"redhat-operators-827sg\" (UID: \"81c4aee6-0cc5-4f75-98b9-d546819ce1df\") " pod="openshift-marketplace/redhat-operators-827sg"
Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.573530 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81c4aee6-0cc5-4f75-98b9-d546819ce1df-utilities\") pod \"redhat-operators-827sg\" (UID: \"81c4aee6-0cc5-4f75-98b9-d546819ce1df\") " pod="openshift-marketplace/redhat-operators-827sg"
Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.573552 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n58cz\" (UniqueName: \"kubernetes.io/projected/81c4aee6-0cc5-4f75-98b9-d546819ce1df-kube-api-access-n58cz\") pod \"redhat-operators-827sg\" (UID: \"81c4aee6-0cc5-4f75-98b9-d546819ce1df\") " pod="openshift-marketplace/redhat-operators-827sg"
Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.574885 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81c4aee6-0cc5-4f75-98b9-d546819ce1df-catalog-content\") pod \"redhat-operators-827sg\" (UID: \"81c4aee6-0cc5-4f75-98b9-d546819ce1df\") " pod="openshift-marketplace/redhat-operators-827sg"
Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.576352 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81c4aee6-0cc5-4f75-98b9-d546819ce1df-utilities\") pod \"redhat-operators-827sg\" (UID: \"81c4aee6-0cc5-4f75-98b9-d546819ce1df\") " pod="openshift-marketplace/redhat-operators-827sg"
Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.597232 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n58cz\" (UniqueName: \"kubernetes.io/projected/81c4aee6-0cc5-4f75-98b9-d546819ce1df-kube-api-access-n58cz\") pod \"redhat-operators-827sg\" (UID: \"81c4aee6-0cc5-4f75-98b9-d546819ce1df\") " pod="openshift-marketplace/redhat-operators-827sg"
Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.678417 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s7lr8"]
Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.684244 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-827sg"
Nov 25 15:40:32 crc kubenswrapper[4704]: W1125 15:40:32.691878 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf11b0bae_8584_41bd_9970_af1f50073c21.slice/crio-9d0590a7a452ed6e8c4cfb9e1a5d9be5f4b30c85c51376c8dbf47ff7bc557c35 WatchSource:0}: Error finding container 9d0590a7a452ed6e8c4cfb9e1a5d9be5f4b30c85c51376c8dbf47ff7bc557c35: Status 404 returned error can't find the container with id 9d0590a7a452ed6e8c4cfb9e1a5d9be5f4b30c85c51376c8dbf47ff7bc557c35
Nov 25 15:40:32 crc kubenswrapper[4704]: I1125 15:40:32.904501 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-827sg"]
Nov 25 15:40:32 crc kubenswrapper[4704]: W1125 15:40:32.934178 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81c4aee6_0cc5_4f75_98b9_d546819ce1df.slice/crio-9f742d32c6d17c9a964a27897474aa72cc516edfaa9438536757c6a0893f35af WatchSource:0}: Error finding container 9f742d32c6d17c9a964a27897474aa72cc516edfaa9438536757c6a0893f35af: Status 404 returned error can't find the container with id 9f742d32c6d17c9a964a27897474aa72cc516edfaa9438536757c6a0893f35af
Nov 25 15:40:33 crc kubenswrapper[4704]: I1125 15:40:33.291230 4704 generic.go:334] "Generic (PLEG): container finished" podID="81c4aee6-0cc5-4f75-98b9-d546819ce1df" containerID="fefe88469fb3306ce64ce2101462edf639a2195ebe54d0005010268540645adf" exitCode=0
Nov 25 15:40:33 crc kubenswrapper[4704]: I1125 15:40:33.291291 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-827sg" event={"ID":"81c4aee6-0cc5-4f75-98b9-d546819ce1df","Type":"ContainerDied","Data":"fefe88469fb3306ce64ce2101462edf639a2195ebe54d0005010268540645adf"}
Nov 25 15:40:33 crc kubenswrapper[4704]: I1125 15:40:33.291370 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-827sg" event={"ID":"81c4aee6-0cc5-4f75-98b9-d546819ce1df","Type":"ContainerStarted","Data":"9f742d32c6d17c9a964a27897474aa72cc516edfaa9438536757c6a0893f35af"}
Nov 25 15:40:33 crc kubenswrapper[4704]: I1125 15:40:33.294301 4704 generic.go:334] "Generic (PLEG): container finished" podID="f11b0bae-8584-41bd-9970-af1f50073c21" containerID="38716d9a9ca80648e047f339f9306922268dfb170ca6380b0c2b2fcc80bb6a52" exitCode=0
Nov 25 15:40:33 crc kubenswrapper[4704]: I1125 15:40:33.294367 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7lr8" event={"ID":"f11b0bae-8584-41bd-9970-af1f50073c21","Type":"ContainerDied","Data":"38716d9a9ca80648e047f339f9306922268dfb170ca6380b0c2b2fcc80bb6a52"}
Nov 25 15:40:33 crc kubenswrapper[4704]: I1125 15:40:33.294404 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7lr8" event={"ID":"f11b0bae-8584-41bd-9970-af1f50073c21","Type":"ContainerStarted","Data":"9d0590a7a452ed6e8c4cfb9e1a5d9be5f4b30c85c51376c8dbf47ff7bc557c35"}
Nov 25 15:40:33 crc kubenswrapper[4704]: I1125 15:40:33.298965 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm54c" event={"ID":"4912dfbf-fbd9-41d7-aba3-0a02558ab662","Type":"ContainerStarted","Data":"6a7d6d9ed3ba86d43df29b494c296c123910b91200b4e80f15e762f8b740dc13"}
Nov 25 15:40:33 crc kubenswrapper[4704]: I1125 15:40:33.302908 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqxfm" event={"ID":"c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13","Type":"ContainerStarted","Data":"1eca90015d49ca8ee2f46dc9781a7a95289c897a034826be6b2aa8f4ff9348c7"}
Nov 25 15:40:33 crc kubenswrapper[4704]: I1125 15:40:33.334101 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zqxfm" podStartSLOduration=2.8591575369999997 podStartE2EDuration="4.334074699s" podCreationTimestamp="2025-11-25 15:40:29 +0000 UTC" firstStartedPulling="2025-11-25 15:40:31.277501008 +0000 UTC m=+317.545774789" lastFinishedPulling="2025-11-25 15:40:32.75241817 +0000 UTC m=+319.020691951" observedRunningTime="2025-11-25 15:40:33.330689669 +0000 UTC m=+319.598963470" watchObservedRunningTime="2025-11-25 15:40:33.334074699 +0000 UTC m=+319.602348480"
Nov 25 15:40:33 crc kubenswrapper[4704]: I1125 15:40:33.351826 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qm54c" podStartSLOduration=2.775061848 podStartE2EDuration="4.351808994s" podCreationTimestamp="2025-11-25 15:40:29 +0000 UTC" firstStartedPulling="2025-11-25 15:40:31.274030566 +0000 UTC m=+317.542304357" lastFinishedPulling="2025-11-25 15:40:32.850777722 +0000 UTC m=+319.119051503" observedRunningTime="2025-11-25 15:40:33.351735042 +0000 UTC m=+319.620008833" watchObservedRunningTime="2025-11-25 15:40:33.351808994 +0000 UTC m=+319.620082775"
Nov 25 15:40:35 crc kubenswrapper[4704]: I1125 15:40:35.315150 4704 generic.go:334] "Generic (PLEG): container finished" podID="f11b0bae-8584-41bd-9970-af1f50073c21" containerID="63bfb021739a49c6081a195f176a4d62e14fcd6836839d30bafd0b88a63c3d2b" exitCode=0
Nov 25 15:40:35 crc kubenswrapper[4704]: I1125 15:40:35.315464 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7lr8" event={"ID":"f11b0bae-8584-41bd-9970-af1f50073c21","Type":"ContainerDied","Data":"63bfb021739a49c6081a195f176a4d62e14fcd6836839d30bafd0b88a63c3d2b"}
Nov 25 15:40:35 crc kubenswrapper[4704]: I1125 15:40:35.318148 4704 generic.go:334] "Generic (PLEG): container finished" podID="81c4aee6-0cc5-4f75-98b9-d546819ce1df" containerID="4fe5b47be27258d7d88a96d7e6d754032d5eb9b9a57f673d6333f8be49d53322" exitCode=0
Nov 25 15:40:35 crc kubenswrapper[4704]: I1125 15:40:35.318186 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-827sg" event={"ID":"81c4aee6-0cc5-4f75-98b9-d546819ce1df","Type":"ContainerDied","Data":"4fe5b47be27258d7d88a96d7e6d754032d5eb9b9a57f673d6333f8be49d53322"}
Nov 25 15:40:36 crc kubenswrapper[4704]: I1125 15:40:36.324884 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-827sg" event={"ID":"81c4aee6-0cc5-4f75-98b9-d546819ce1df","Type":"ContainerStarted","Data":"8c76f1fee0562ec0f084e48f0ff55b446605de921fdf464fdc3fb312e7b2cf5e"}
Nov 25 15:40:36 crc kubenswrapper[4704]: I1125 15:40:36.329008 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s7lr8" event={"ID":"f11b0bae-8584-41bd-9970-af1f50073c21","Type":"ContainerStarted","Data":"ccaa01647ad0c33d29ddfd65423a53a527910596d708eeba7fb459b88e10a6da"}
Nov 25 15:40:36 crc kubenswrapper[4704]: I1125 15:40:36.342851 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-827sg" podStartSLOduration=1.8924766800000001 podStartE2EDuration="4.342828208s" podCreationTimestamp="2025-11-25 15:40:32 +0000 UTC" firstStartedPulling="2025-11-25 15:40:33.293197129 +0000 UTC m=+319.561470910" lastFinishedPulling="2025-11-25 15:40:35.743548657 +0000 UTC m=+322.011822438" observedRunningTime="2025-11-25 15:40:36.340701385 +0000 UTC m=+322.608975166" watchObservedRunningTime="2025-11-25 15:40:36.342828208 +0000 UTC m=+322.611101989"
Nov 25 15:40:36 crc kubenswrapper[4704]: I1125 15:40:36.363440 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s7lr8" podStartSLOduration=1.727251478 podStartE2EDuration="4.363413928s" podCreationTimestamp="2025-11-25 15:40:32 +0000 UTC" firstStartedPulling="2025-11-25 15:40:33.298220287 +0000 UTC m=+319.566494068" lastFinishedPulling="2025-11-25 15:40:35.934382737 +0000 UTC m=+322.202656518" observedRunningTime="2025-11-25 15:40:36.360282245 +0000 UTC m=+322.628556026" watchObservedRunningTime="2025-11-25 15:40:36.363413928 +0000 UTC m=+322.631687719"
Nov 25 15:40:40 crc kubenswrapper[4704]: I1125 15:40:40.079189 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zqxfm"
Nov 25 15:40:40 crc kubenswrapper[4704]: I1125 15:40:40.080037 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zqxfm"
Nov 25 15:40:40 crc kubenswrapper[4704]: I1125 15:40:40.138765 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zqxfm"
Nov 25 15:40:40 crc kubenswrapper[4704]: I1125 15:40:40.287897 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qm54c"
Nov 25 15:40:40 crc kubenswrapper[4704]: I1125 15:40:40.288368 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qm54c"
Nov 25 15:40:40 crc kubenswrapper[4704]: I1125 15:40:40.330050 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qm54c"
Nov 25 15:40:40 crc kubenswrapper[4704]: I1125 15:40:40.390859 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qm54c"
Nov 25 15:40:40 crc kubenswrapper[4704]: I1125 15:40:40.392999 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zqxfm"
Nov 25 15:40:42 crc kubenswrapper[4704]: I1125 15:40:42.483464 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s7lr8"
Nov 25 15:40:42 crc kubenswrapper[4704]: I1125 15:40:42.484957 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s7lr8"
Nov 25 15:40:42 crc kubenswrapper[4704]: I1125 15:40:42.527621 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s7lr8"
Nov 25 15:40:42 crc kubenswrapper[4704]: I1125 15:40:42.685333 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-827sg"
Nov 25 15:40:42 crc kubenswrapper[4704]: I1125 15:40:42.685637 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-827sg"
Nov 25 15:40:42 crc kubenswrapper[4704]: I1125 15:40:42.747875 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-827sg"
Nov 25 15:40:43 crc kubenswrapper[4704]: I1125 15:40:43.409993 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s7lr8"
Nov 25 15:40:43 crc kubenswrapper[4704]: I1125 15:40:43.411766 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-827sg"
Nov 25 15:41:37 crc kubenswrapper[4704]: I1125 15:41:37.964265 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 15:41:37 crc kubenswrapper[4704]: I1125 15:41:37.965405 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 15:42:07 crc kubenswrapper[4704]: I1125 15:42:07.964030 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 15:42:07 crc kubenswrapper[4704]: I1125 15:42:07.964900 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 15:42:37 crc kubenswrapper[4704]: I1125 15:42:37.965141 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 15:42:37 crc kubenswrapper[4704]: I1125 15:42:37.966126 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 15:42:37 crc kubenswrapper[4704]: I1125 15:42:37.966179 4704 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djz8x"
Nov 25 15:42:37 crc kubenswrapper[4704]: I1125 15:42:37.966842 4704 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf646e20b03b3390aa256db5e03bee5d833cba5b9a37144d98eae89a8816d8d1"} pod="openshift-machine-config-operator/machine-config-daemon-djz8x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 25 15:42:37 crc kubenswrapper[4704]: I1125 15:42:37.966900 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" containerID="cri-o://bf646e20b03b3390aa256db5e03bee5d833cba5b9a37144d98eae89a8816d8d1" gracePeriod=600
Nov 25 15:42:38 crc kubenswrapper[4704]: I1125 15:42:38.985213 4704 generic.go:334] "Generic (PLEG): container finished" podID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerID="bf646e20b03b3390aa256db5e03bee5d833cba5b9a37144d98eae89a8816d8d1" exitCode=0
Nov 25 15:42:38 crc kubenswrapper[4704]: I1125 15:42:38.985297 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerDied","Data":"bf646e20b03b3390aa256db5e03bee5d833cba5b9a37144d98eae89a8816d8d1"}
Nov 25 15:42:38 crc kubenswrapper[4704]: I1125 15:42:38.986061 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerStarted","Data":"491da08c31f2f2cd2745fd9b52997ec5a66034a8d558b6b85cbfececf99b972a"}
Nov 25 15:42:38 crc kubenswrapper[4704]: I1125 15:42:38.986100 4704 scope.go:117] "RemoveContainer" containerID="7929ba83619d7931b1d300c06bf9091d9e711ae6a14dad101005c5b42d07f675"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:29.999761 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-qfw6g"]
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.001511 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.022767 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-qfw6g"]
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.197585 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43b29c28-2665-4cf5-a693-157444594c00-registry-tls\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.197669 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43b29c28-2665-4cf5-a693-157444594c00-bound-sa-token\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.197695 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43b29c28-2665-4cf5-a693-157444594c00-registry-certificates\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.197754 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.197879 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43b29c28-2665-4cf5-a693-157444594c00-installation-pull-secrets\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.197954 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43b29c28-2665-4cf5-a693-157444594c00-ca-trust-extracted\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.198014 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jglxq\" (UniqueName: \"kubernetes.io/projected/43b29c28-2665-4cf5-a693-157444594c00-kube-api-access-jglxq\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.198064 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43b29c28-2665-4cf5-a693-157444594c00-trusted-ca\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.220739 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.299773 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43b29c28-2665-4cf5-a693-157444594c00-bound-sa-token\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.299845 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43b29c28-2665-4cf5-a693-157444594c00-registry-certificates\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.299875 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43b29c28-2665-4cf5-a693-157444594c00-installation-pull-secrets\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.299892 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43b29c28-2665-4cf5-a693-157444594c00-ca-trust-extracted\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.299919 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jglxq\" (UniqueName: \"kubernetes.io/projected/43b29c28-2665-4cf5-a693-157444594c00-kube-api-access-jglxq\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.299942 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43b29c28-2665-4cf5-a693-157444594c00-trusted-ca\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.299966 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43b29c28-2665-4cf5-a693-157444594c00-registry-tls\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.301443 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43b29c28-2665-4cf5-a693-157444594c00-registry-certificates\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.301818 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43b29c28-2665-4cf5-a693-157444594c00-trusted-ca\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.302175 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43b29c28-2665-4cf5-a693-157444594c00-ca-trust-extracted\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.307883 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43b29c28-2665-4cf5-a693-157444594c00-installation-pull-secrets\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.322681 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43b29c28-2665-4cf5-a693-157444594c00-registry-tls\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.324969 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jglxq\" (UniqueName: \"kubernetes.io/projected/43b29c28-2665-4cf5-a693-157444594c00-kube-api-access-jglxq\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.339107 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43b29c28-2665-4cf5-a693-157444594c00-bound-sa-token\") pod \"image-registry-66df7c8f76-qfw6g\" (UID: \"43b29c28-2665-4cf5-a693-157444594c00\") " pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.622307 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:30 crc kubenswrapper[4704]: I1125 15:43:30.794329 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-qfw6g"]
Nov 25 15:43:31 crc kubenswrapper[4704]: I1125 15:43:31.316089 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g" event={"ID":"43b29c28-2665-4cf5-a693-157444594c00","Type":"ContainerStarted","Data":"52661dca9c24d18e3e0a73020c5b3cff3e8609afe80d9a6c7d0a56f59da7a5a7"}
Nov 25 15:43:31 crc kubenswrapper[4704]: I1125 15:43:31.316176 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g" event={"ID":"43b29c28-2665-4cf5-a693-157444594c00","Type":"ContainerStarted","Data":"b49422435a7b505734475601aee97820b3e372b268ddde3ae104233937ce1b15"}
Nov 25 15:43:31 crc kubenswrapper[4704]: I1125 15:43:31.317175 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:31 crc kubenswrapper[4704]: I1125 15:43:31.340009 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g" podStartSLOduration=2.339982318 podStartE2EDuration="2.339982318s" podCreationTimestamp="2025-11-25 15:43:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:43:31.336362512 +0000 UTC m=+497.604636293" watchObservedRunningTime="2025-11-25 15:43:31.339982318 +0000 UTC m=+497.608256109"
Nov 25 15:43:50 crc kubenswrapper[4704]: I1125 15:43:50.628268 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-qfw6g"
Nov 25 15:43:50 crc kubenswrapper[4704]: I1125 15:43:50.683278 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qb7gf"]
Nov 25 15:44:14 crc kubenswrapper[4704]: I1125 15:44:14.608273 4704 scope.go:117] "RemoveContainer" containerID="f341ce98d4e9ecacadbce59d4c73895573b4da41c4a2048a581125d4b78ade3f"
Nov 25 15:44:15 crc kubenswrapper[4704]: I1125 15:44:15.727324 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" podUID="97aea51f-7b9e-44f2-a310-8a27cc66f8d9" containerName="registry" containerID="cri-o://3efdca317811d9aa436d3ad8123fef082412fb166be72f895a197a34bf42d286" gracePeriod=30
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.153600 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf"
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.312630 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") "
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.312695 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-registry-certificates\") pod \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") "
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.312726 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2rbw\" (UniqueName: \"kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-kube-api-access-s2rbw\") pod \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") "
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.312838 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-trusted-ca\") pod \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") "
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.312888 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-registry-tls\") pod \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") "
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.312935 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-ca-trust-extracted\") pod \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") "
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.312956 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-installation-pull-secrets\") pod \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") "
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.312982 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-bound-sa-token\") pod \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\" (UID: \"97aea51f-7b9e-44f2-a310-8a27cc66f8d9\") "
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.314024 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "97aea51f-7b9e-44f2-a310-8a27cc66f8d9" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.314015 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "97aea51f-7b9e-44f2-a310-8a27cc66f8d9" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.319414 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "97aea51f-7b9e-44f2-a310-8a27cc66f8d9" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.319737 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-kube-api-access-s2rbw" (OuterVolumeSpecName: "kube-api-access-s2rbw") pod "97aea51f-7b9e-44f2-a310-8a27cc66f8d9" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9"). InnerVolumeSpecName "kube-api-access-s2rbw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.321766 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "97aea51f-7b9e-44f2-a310-8a27cc66f8d9" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.323030 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "97aea51f-7b9e-44f2-a310-8a27cc66f8d9" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.325106 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "97aea51f-7b9e-44f2-a310-8a27cc66f8d9" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.330606 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "97aea51f-7b9e-44f2-a310-8a27cc66f8d9" (UID: "97aea51f-7b9e-44f2-a310-8a27cc66f8d9"). InnerVolumeSpecName "ca-trust-extracted".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.414848 4704 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.415448 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2rbw\" (UniqueName: \"kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-kube-api-access-s2rbw\") on node \"crc\" DevicePath \"\"" Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.415461 4704 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.415475 4704 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.415484 4704 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.415493 4704 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.415501 4704 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97aea51f-7b9e-44f2-a310-8a27cc66f8d9-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 15:44:16 crc 
kubenswrapper[4704]: I1125 15:44:16.575612 4704 generic.go:334] "Generic (PLEG): container finished" podID="97aea51f-7b9e-44f2-a310-8a27cc66f8d9" containerID="3efdca317811d9aa436d3ad8123fef082412fb166be72f895a197a34bf42d286" exitCode=0 Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.575674 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" event={"ID":"97aea51f-7b9e-44f2-a310-8a27cc66f8d9","Type":"ContainerDied","Data":"3efdca317811d9aa436d3ad8123fef082412fb166be72f895a197a34bf42d286"} Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.575719 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" event={"ID":"97aea51f-7b9e-44f2-a310-8a27cc66f8d9","Type":"ContainerDied","Data":"4b1eb8a36a5a83a2440695304a99e241e1bae86ffce2fd66da5c1489a342c49d"} Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.575739 4704 scope.go:117] "RemoveContainer" containerID="3efdca317811d9aa436d3ad8123fef082412fb166be72f895a197a34bf42d286" Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.575748 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qb7gf" Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.605217 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qb7gf"] Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.607488 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qb7gf"] Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.614409 4704 scope.go:117] "RemoveContainer" containerID="3efdca317811d9aa436d3ad8123fef082412fb166be72f895a197a34bf42d286" Nov 25 15:44:16 crc kubenswrapper[4704]: E1125 15:44:16.615279 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3efdca317811d9aa436d3ad8123fef082412fb166be72f895a197a34bf42d286\": container with ID starting with 3efdca317811d9aa436d3ad8123fef082412fb166be72f895a197a34bf42d286 not found: ID does not exist" containerID="3efdca317811d9aa436d3ad8123fef082412fb166be72f895a197a34bf42d286" Nov 25 15:44:16 crc kubenswrapper[4704]: I1125 15:44:16.615352 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3efdca317811d9aa436d3ad8123fef082412fb166be72f895a197a34bf42d286"} err="failed to get container status \"3efdca317811d9aa436d3ad8123fef082412fb166be72f895a197a34bf42d286\": rpc error: code = NotFound desc = could not find container \"3efdca317811d9aa436d3ad8123fef082412fb166be72f895a197a34bf42d286\": container with ID starting with 3efdca317811d9aa436d3ad8123fef082412fb166be72f895a197a34bf42d286 not found: ID does not exist" Nov 25 15:44:18 crc kubenswrapper[4704]: I1125 15:44:18.425163 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97aea51f-7b9e-44f2-a310-8a27cc66f8d9" path="/var/lib/kubelet/pods/97aea51f-7b9e-44f2-a310-8a27cc66f8d9/volumes" Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 
15:45:00.132582 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2"] Nov 25 15:45:00 crc kubenswrapper[4704]: E1125 15:45:00.133664 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97aea51f-7b9e-44f2-a310-8a27cc66f8d9" containerName="registry" Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.133677 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="97aea51f-7b9e-44f2-a310-8a27cc66f8d9" containerName="registry" Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.133779 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="97aea51f-7b9e-44f2-a310-8a27cc66f8d9" containerName="registry" Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.134201 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.136939 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.136983 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.150198 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2"] Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.271512 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-secret-volume\") pod \"collect-profiles-29401425-lgcz2\" (UID: \"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" Nov 25 15:45:00 crc 
kubenswrapper[4704]: I1125 15:45:00.271612 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsclq\" (UniqueName: \"kubernetes.io/projected/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-kube-api-access-zsclq\") pod \"collect-profiles-29401425-lgcz2\" (UID: \"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.271648 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-config-volume\") pod \"collect-profiles-29401425-lgcz2\" (UID: \"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.372291 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-config-volume\") pod \"collect-profiles-29401425-lgcz2\" (UID: \"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.372387 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-secret-volume\") pod \"collect-profiles-29401425-lgcz2\" (UID: \"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.372449 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsclq\" (UniqueName: \"kubernetes.io/projected/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-kube-api-access-zsclq\") pod 
\"collect-profiles-29401425-lgcz2\" (UID: \"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.373656 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-config-volume\") pod \"collect-profiles-29401425-lgcz2\" (UID: \"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.379963 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-secret-volume\") pod \"collect-profiles-29401425-lgcz2\" (UID: \"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.389533 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsclq\" (UniqueName: \"kubernetes.io/projected/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-kube-api-access-zsclq\") pod \"collect-profiles-29401425-lgcz2\" (UID: \"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.456716 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.653566 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2"] Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.825734 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" event={"ID":"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd","Type":"ContainerStarted","Data":"0a8e4c6f14ebc984ca213eed49f4bda3f70b1870f5eeef11a4f30d39ba2d5800"} Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.826227 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" event={"ID":"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd","Type":"ContainerStarted","Data":"a5868aa4c6dc1d70c2bc991faebf2e4db4c29193c9071021412db2a16fa10551"} Nov 25 15:45:00 crc kubenswrapper[4704]: I1125 15:45:00.844945 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" podStartSLOduration=0.8449186 podStartE2EDuration="844.9186ms" podCreationTimestamp="2025-11-25 15:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:45:00.842476649 +0000 UTC m=+587.110750450" watchObservedRunningTime="2025-11-25 15:45:00.8449186 +0000 UTC m=+587.113192381" Nov 25 15:45:01 crc kubenswrapper[4704]: I1125 15:45:01.831948 4704 generic.go:334] "Generic (PLEG): container finished" podID="0a1de9f1-c47b-46ec-9549-b643ddf4ecdd" containerID="0a8e4c6f14ebc984ca213eed49f4bda3f70b1870f5eeef11a4f30d39ba2d5800" exitCode=0 Nov 25 15:45:01 crc kubenswrapper[4704]: I1125 15:45:01.831996 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" event={"ID":"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd","Type":"ContainerDied","Data":"0a8e4c6f14ebc984ca213eed49f4bda3f70b1870f5eeef11a4f30d39ba2d5800"} Nov 25 15:45:03 crc kubenswrapper[4704]: I1125 15:45:03.065381 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" Nov 25 15:45:03 crc kubenswrapper[4704]: I1125 15:45:03.207254 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-config-volume\") pod \"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd\" (UID: \"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd\") " Nov 25 15:45:03 crc kubenswrapper[4704]: I1125 15:45:03.207347 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsclq\" (UniqueName: \"kubernetes.io/projected/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-kube-api-access-zsclq\") pod \"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd\" (UID: \"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd\") " Nov 25 15:45:03 crc kubenswrapper[4704]: I1125 15:45:03.207384 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-secret-volume\") pod \"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd\" (UID: \"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd\") " Nov 25 15:45:03 crc kubenswrapper[4704]: I1125 15:45:03.208231 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-config-volume" (OuterVolumeSpecName: "config-volume") pod "0a1de9f1-c47b-46ec-9549-b643ddf4ecdd" (UID: "0a1de9f1-c47b-46ec-9549-b643ddf4ecdd"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:45:03 crc kubenswrapper[4704]: I1125 15:45:03.213596 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0a1de9f1-c47b-46ec-9549-b643ddf4ecdd" (UID: "0a1de9f1-c47b-46ec-9549-b643ddf4ecdd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:45:03 crc kubenswrapper[4704]: I1125 15:45:03.215185 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-kube-api-access-zsclq" (OuterVolumeSpecName: "kube-api-access-zsclq") pod "0a1de9f1-c47b-46ec-9549-b643ddf4ecdd" (UID: "0a1de9f1-c47b-46ec-9549-b643ddf4ecdd"). InnerVolumeSpecName "kube-api-access-zsclq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:45:03 crc kubenswrapper[4704]: I1125 15:45:03.309116 4704 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 15:45:03 crc kubenswrapper[4704]: I1125 15:45:03.309178 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsclq\" (UniqueName: \"kubernetes.io/projected/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-kube-api-access-zsclq\") on node \"crc\" DevicePath \"\"" Nov 25 15:45:03 crc kubenswrapper[4704]: I1125 15:45:03.309194 4704 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a1de9f1-c47b-46ec-9549-b643ddf4ecdd-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 15:45:03 crc kubenswrapper[4704]: I1125 15:45:03.845004 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" 
event={"ID":"0a1de9f1-c47b-46ec-9549-b643ddf4ecdd","Type":"ContainerDied","Data":"a5868aa4c6dc1d70c2bc991faebf2e4db4c29193c9071021412db2a16fa10551"} Nov 25 15:45:03 crc kubenswrapper[4704]: I1125 15:45:03.845045 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5868aa4c6dc1d70c2bc991faebf2e4db4c29193c9071021412db2a16fa10551" Nov 25 15:45:03 crc kubenswrapper[4704]: I1125 15:45:03.845072 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401425-lgcz2" Nov 25 15:45:07 crc kubenswrapper[4704]: I1125 15:45:07.964345 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:45:07 crc kubenswrapper[4704]: I1125 15:45:07.964811 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:45:14 crc kubenswrapper[4704]: I1125 15:45:14.647387 4704 scope.go:117] "RemoveContainer" containerID="ba09015ba8c7d8a74a422af49cd1b9b24702456343aa7130512a223e529f593a" Nov 25 15:45:14 crc kubenswrapper[4704]: I1125 15:45:14.669432 4704 scope.go:117] "RemoveContainer" containerID="598c71074243848618353669294d7ff850808fdebaac98b749b8d7090caaa4c3" Nov 25 15:45:37 crc kubenswrapper[4704]: I1125 15:45:37.964881 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:45:37 crc kubenswrapper[4704]: I1125 15:45:37.966155 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:46:07 crc kubenswrapper[4704]: I1125 15:46:07.964560 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:46:07 crc kubenswrapper[4704]: I1125 15:46:07.965430 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:46:07 crc kubenswrapper[4704]: I1125 15:46:07.965482 4704 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:46:07 crc kubenswrapper[4704]: I1125 15:46:07.966017 4704 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"491da08c31f2f2cd2745fd9b52997ec5a66034a8d558b6b85cbfececf99b972a"} pod="openshift-machine-config-operator/machine-config-daemon-djz8x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 15:46:07 crc kubenswrapper[4704]: I1125 15:46:07.966072 4704 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" containerID="cri-o://491da08c31f2f2cd2745fd9b52997ec5a66034a8d558b6b85cbfececf99b972a" gracePeriod=600 Nov 25 15:46:08 crc kubenswrapper[4704]: I1125 15:46:08.137326 4704 generic.go:334] "Generic (PLEG): container finished" podID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerID="491da08c31f2f2cd2745fd9b52997ec5a66034a8d558b6b85cbfececf99b972a" exitCode=0 Nov 25 15:46:08 crc kubenswrapper[4704]: I1125 15:46:08.137403 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerDied","Data":"491da08c31f2f2cd2745fd9b52997ec5a66034a8d558b6b85cbfececf99b972a"} Nov 25 15:46:08 crc kubenswrapper[4704]: I1125 15:46:08.137479 4704 scope.go:117] "RemoveContainer" containerID="bf646e20b03b3390aa256db5e03bee5d833cba5b9a37144d98eae89a8816d8d1" Nov 25 15:46:09 crc kubenswrapper[4704]: I1125 15:46:09.145676 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerStarted","Data":"70c340a5598fd3ac0fcb6b9ef0ce0145e436d285d89d93a8b40ff742af895c50"} Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.589460 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5kt46"] Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.591417 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovn-controller" containerID="cri-o://1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a" gracePeriod=30 Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.591938 4704 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7" gracePeriod=30 Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.591971 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovn-acl-logging" containerID="cri-o://7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a" gracePeriod=30 Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.591951 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="kube-rbac-proxy-node" containerID="cri-o://9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc" gracePeriod=30 Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.592065 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="sbdb" containerID="cri-o://717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4" gracePeriod=30 Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.592158 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="nbdb" containerID="cri-o://b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2" gracePeriod=30 Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.592184 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="northd" 
containerID="cri-o://d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10" gracePeriod=30 Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.626359 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovnkube-controller" containerID="cri-o://1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082" gracePeriod=30 Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.928691 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovnkube-controller/3.log" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.932289 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovn-acl-logging/0.log" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.932859 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovn-controller/0.log" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.933438 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.993484 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sc8dr"] Nov 25 15:46:39 crc kubenswrapper[4704]: E1125 15:46:39.993758 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovnkube-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.993777 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovnkube-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: E1125 15:46:39.993816 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovnkube-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.993825 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovnkube-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: E1125 15:46:39.993833 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovn-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.993841 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovn-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: E1125 15:46:39.993852 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovn-acl-logging" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.993859 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovn-acl-logging" Nov 25 15:46:39 crc kubenswrapper[4704]: E1125 15:46:39.993872 4704 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovnkube-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.993879 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovnkube-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: E1125 15:46:39.993888 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="kubecfg-setup" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.993895 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="kubecfg-setup" Nov 25 15:46:39 crc kubenswrapper[4704]: E1125 15:46:39.993904 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="nbdb" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.993912 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="nbdb" Nov 25 15:46:39 crc kubenswrapper[4704]: E1125 15:46:39.993921 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovnkube-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.993928 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovnkube-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: E1125 15:46:39.993938 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="kube-rbac-proxy-ovn-metrics" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.993946 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="kube-rbac-proxy-ovn-metrics" Nov 25 15:46:39 crc kubenswrapper[4704]: E1125 15:46:39.993960 4704 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="kube-rbac-proxy-node" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.993972 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="kube-rbac-proxy-node" Nov 25 15:46:39 crc kubenswrapper[4704]: E1125 15:46:39.993988 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="northd" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.993996 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="northd" Nov 25 15:46:39 crc kubenswrapper[4704]: E1125 15:46:39.994007 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a1de9f1-c47b-46ec-9549-b643ddf4ecdd" containerName="collect-profiles" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.994015 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a1de9f1-c47b-46ec-9549-b643ddf4ecdd" containerName="collect-profiles" Nov 25 15:46:39 crc kubenswrapper[4704]: E1125 15:46:39.994027 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="sbdb" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.994034 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="sbdb" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.994153 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovnkube-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.994168 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovnkube-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.994175 4704 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="sbdb" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.994183 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a1de9f1-c47b-46ec-9549-b643ddf4ecdd" containerName="collect-profiles" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.994190 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovn-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.994201 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="nbdb" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.994210 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovnkube-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.994219 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="kube-rbac-proxy-ovn-metrics" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.994226 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="northd" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.994232 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovn-acl-logging" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.994240 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="kube-rbac-proxy-node" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.994249 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovnkube-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: E1125 15:46:39.994347 4704 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovnkube-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.994357 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovnkube-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.994481 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5274608-0c76-48d9-949d-53254df99b83" containerName="ovnkube-controller" Nov 25 15:46:39 crc kubenswrapper[4704]: I1125 15:46:39.996119 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102133 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-run-ovn-kubernetes\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102200 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-ovnkube-script-lib\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102225 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-etc-openvswitch\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102247 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-ovnkube-config\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102269 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-cni-bin\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102292 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-node-log\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102322 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-ovn\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102377 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-var-lib-openvswitch\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102396 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-slash\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102423 4704 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f5274608-0c76-48d9-949d-53254df99b83-ovn-node-metrics-cert\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102456 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-env-overrides\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102511 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-cni-netd\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102535 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-systemd-units\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102558 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-log-socket\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102580 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-openvswitch\") pod 
\"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102605 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-systemd\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102629 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-run-netns\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102653 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102678 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw5d9\" (UniqueName: \"kubernetes.io/projected/f5274608-0c76-48d9-949d-53254df99b83-kube-api-access-cw5d9\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102696 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-kubelet\") pod \"f5274608-0c76-48d9-949d-53254df99b83\" (UID: \"f5274608-0c76-48d9-949d-53254df99b83\") " Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102871 4704 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-kubelet\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102894 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-run-ovn-kubernetes\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102910 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/940d9ea0-d9cc-4cf9-9211-34ca72879d09-ovnkube-script-lib\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102927 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102942 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-systemd-units\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102974 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/940d9ea0-d9cc-4cf9-9211-34ca72879d09-env-overrides\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.102991 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-cni-bin\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.103006 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-slash\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.103024 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-run-systemd\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.103052 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-var-lib-openvswitch\") pod \"ovnkube-node-sc8dr\" (UID: 
\"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.103075 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-cni-netd\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.103111 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwcmz\" (UniqueName: \"kubernetes.io/projected/940d9ea0-d9cc-4cf9-9211-34ca72879d09-kube-api-access-gwcmz\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.103133 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-log-socket\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.103154 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-etc-openvswitch\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.103179 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-run-netns\") pod 
\"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.103200 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/940d9ea0-d9cc-4cf9-9211-34ca72879d09-ovnkube-config\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.103219 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-node-log\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.103235 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/940d9ea0-d9cc-4cf9-9211-34ca72879d09-ovn-node-metrics-cert\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.103257 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-run-openvswitch\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.103282 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-run-ovn\") 
pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.103407 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.103965 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.104121 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.104462 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.104493 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.104513 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-node-log" (OuterVolumeSpecName: "node-log") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.104530 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.104552 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.104570 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-slash" (OuterVolumeSpecName: "host-slash") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.104698 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.104732 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-log-socket" (OuterVolumeSpecName: "log-socket") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.104754 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.104778 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.105079 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.105118 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.105148 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.105609 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.112628 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5274608-0c76-48d9-949d-53254df99b83-kube-api-access-cw5d9" (OuterVolumeSpecName: "kube-api-access-cw5d9") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "kube-api-access-cw5d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.112866 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5274608-0c76-48d9-949d-53254df99b83-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.125178 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f5274608-0c76-48d9-949d-53254df99b83" (UID: "f5274608-0c76-48d9-949d-53254df99b83"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.204218 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-cni-netd\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.204405 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwcmz\" (UniqueName: \"kubernetes.io/projected/940d9ea0-d9cc-4cf9-9211-34ca72879d09-kube-api-access-gwcmz\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.204866 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-log-socket\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.204312 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-cni-netd\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.204952 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-etc-openvswitch\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc 
kubenswrapper[4704]: I1125 15:46:40.205048 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-run-netns\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205070 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-etc-openvswitch\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205107 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/940d9ea0-d9cc-4cf9-9211-34ca72879d09-ovnkube-config\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205145 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-run-netns\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205178 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-node-log\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205213 4704 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/940d9ea0-d9cc-4cf9-9211-34ca72879d09-ovn-node-metrics-cert\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205003 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-log-socket\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205251 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-run-openvswitch\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205211 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-node-log\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205233 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-run-openvswitch\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205417 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-run-ovn\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205487 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-run-ovn\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205514 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-kubelet\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205558 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-run-ovn-kubernetes\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205588 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-kubelet\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205594 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/940d9ea0-d9cc-4cf9-9211-34ca72879d09-ovnkube-script-lib\") pod 
\"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205625 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205654 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-systemd-units\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205679 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/940d9ea0-d9cc-4cf9-9211-34ca72879d09-env-overrides\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205626 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-run-ovn-kubernetes\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205679 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205731 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-cni-bin\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205780 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-slash\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205829 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-run-systemd\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205835 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-systemd-units\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205781 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-cni-bin\") pod 
\"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205900 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-run-systemd\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205959 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-var-lib-openvswitch\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.205968 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-host-slash\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206069 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw5d9\" (UniqueName: \"kubernetes.io/projected/f5274608-0c76-48d9-949d-53254df99b83-kube-api-access-cw5d9\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206088 4704 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206105 4704 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206117 4704 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206129 4704 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206144 4704 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206158 4704 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206168 4704 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-node-log\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206187 4704 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206177 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/940d9ea0-d9cc-4cf9-9211-34ca72879d09-var-lib-openvswitch\") pod \"ovnkube-node-sc8dr\" 
(UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206239 4704 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206257 4704 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-slash\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206271 4704 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f5274608-0c76-48d9-949d-53254df99b83-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206324 4704 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f5274608-0c76-48d9-949d-53254df99b83-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206336 4704 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206347 4704 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206358 4704 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-log-socket\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc 
kubenswrapper[4704]: I1125 15:46:40.206371 4704 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206382 4704 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206395 4704 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206396 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/940d9ea0-d9cc-4cf9-9211-34ca72879d09-env-overrides\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206409 4704 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f5274608-0c76-48d9-949d-53254df99b83-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.206535 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/940d9ea0-d9cc-4cf9-9211-34ca72879d09-ovnkube-script-lib\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.207025 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/940d9ea0-d9cc-4cf9-9211-34ca72879d09-ovnkube-config\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.208634 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/940d9ea0-d9cc-4cf9-9211-34ca72879d09-ovn-node-metrics-cert\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.221374 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwcmz\" (UniqueName: \"kubernetes.io/projected/940d9ea0-d9cc-4cf9-9211-34ca72879d09-kube-api-access-gwcmz\") pod \"ovnkube-node-sc8dr\" (UID: \"940d9ea0-d9cc-4cf9-9211-34ca72879d09\") " pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.308757 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h92xm_d2820ade-e9bd-4146-b275-0c3b7d0cb5aa/kube-multus/2.log" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.309462 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h92xm_d2820ade-e9bd-4146-b275-0c3b7d0cb5aa/kube-multus/1.log" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.309523 4704 generic.go:334] "Generic (PLEG): container finished" podID="d2820ade-e9bd-4146-b275-0c3b7d0cb5aa" containerID="6ebde3bce3ebb98df82c1e2217d50256663339143d5a82ad4958eeed412b4c81" exitCode=2 Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.309638 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h92xm" 
event={"ID":"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa","Type":"ContainerDied","Data":"6ebde3bce3ebb98df82c1e2217d50256663339143d5a82ad4958eeed412b4c81"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.309725 4704 scope.go:117] "RemoveContainer" containerID="89cb23cc625602134c0e14c76fa545386707fdb2815ffbc0563737edc6552ff0" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.310155 4704 scope.go:117] "RemoveContainer" containerID="6ebde3bce3ebb98df82c1e2217d50256663339143d5a82ad4958eeed412b4c81" Nov 25 15:46:40 crc kubenswrapper[4704]: E1125 15:46:40.310539 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-h92xm_openshift-multus(d2820ade-e9bd-4146-b275-0c3b7d0cb5aa)\"" pod="openshift-multus/multus-h92xm" podUID="d2820ade-e9bd-4146-b275-0c3b7d0cb5aa" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.311842 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.313653 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovnkube-controller/3.log" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.323052 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovn-acl-logging/0.log" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.324447 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5kt46_f5274608-0c76-48d9-949d-53254df99b83/ovn-controller/0.log" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.326541 4704 generic.go:334] "Generic (PLEG): container finished" podID="f5274608-0c76-48d9-949d-53254df99b83" containerID="1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082" exitCode=0 Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.326605 4704 generic.go:334] "Generic (PLEG): container finished" podID="f5274608-0c76-48d9-949d-53254df99b83" containerID="717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4" exitCode=0 Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.326627 4704 generic.go:334] "Generic (PLEG): container finished" podID="f5274608-0c76-48d9-949d-53254df99b83" containerID="b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2" exitCode=0 Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.326645 4704 generic.go:334] "Generic (PLEG): container finished" podID="f5274608-0c76-48d9-949d-53254df99b83" containerID="d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10" exitCode=0 Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.326662 4704 generic.go:334] "Generic (PLEG): container finished" podID="f5274608-0c76-48d9-949d-53254df99b83" 
containerID="10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7" exitCode=0 Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.326678 4704 generic.go:334] "Generic (PLEG): container finished" podID="f5274608-0c76-48d9-949d-53254df99b83" containerID="9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc" exitCode=0 Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.326695 4704 generic.go:334] "Generic (PLEG): container finished" podID="f5274608-0c76-48d9-949d-53254df99b83" containerID="7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a" exitCode=143 Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.326718 4704 generic.go:334] "Generic (PLEG): container finished" podID="f5274608-0c76-48d9-949d-53254df99b83" containerID="1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a" exitCode=143 Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.326765 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerDied","Data":"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.326852 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerDied","Data":"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.326884 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerDied","Data":"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.326909 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" 
event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerDied","Data":"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.326936 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerDied","Data":"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.326960 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerDied","Data":"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.326992 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327020 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327036 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327050 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327066 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327080 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327094 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327109 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327123 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327139 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327160 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerDied","Data":"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327181 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327198 4704 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327212 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327227 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327242 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327257 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327272 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327286 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327301 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327315 4704 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327336 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerDied","Data":"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327359 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327376 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327391 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327414 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327429 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327444 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7"} Nov 25 
15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327459 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327474 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327488 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327508 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327527 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" event={"ID":"f5274608-0c76-48d9-949d-53254df99b83","Type":"ContainerDied","Data":"86a6e2b98ed5c37e9f0fbaecb72bca014099df11c423004fd1ad2fc6d2720538"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327550 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327570 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327586 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327602 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327616 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327631 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327645 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327658 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327671 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.327684 4704 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58"} Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.328077 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5kt46" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.355865 4704 scope.go:117] "RemoveContainer" containerID="1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.386268 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5kt46"] Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.390348 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5kt46"] Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.393061 4704 scope.go:117] "RemoveContainer" containerID="c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.415324 4704 scope.go:117] "RemoveContainer" containerID="717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.424296 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5274608-0c76-48d9-949d-53254df99b83" path="/var/lib/kubelet/pods/f5274608-0c76-48d9-949d-53254df99b83/volumes" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.429808 4704 scope.go:117] "RemoveContainer" containerID="b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.447032 4704 scope.go:117] "RemoveContainer" containerID="d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.462757 4704 scope.go:117] "RemoveContainer" containerID="10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.480484 4704 scope.go:117] "RemoveContainer" containerID="9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.585861 4704 scope.go:117] 
"RemoveContainer" containerID="7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.603528 4704 scope.go:117] "RemoveContainer" containerID="1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.620421 4704 scope.go:117] "RemoveContainer" containerID="58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.641581 4704 scope.go:117] "RemoveContainer" containerID="1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082" Nov 25 15:46:40 crc kubenswrapper[4704]: E1125 15:46:40.642838 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082\": container with ID starting with 1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082 not found: ID does not exist" containerID="1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.642875 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082"} err="failed to get container status \"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082\": rpc error: code = NotFound desc = could not find container \"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082\": container with ID starting with 1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.642900 4704 scope.go:117] "RemoveContainer" containerID="c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606" Nov 25 15:46:40 crc kubenswrapper[4704]: E1125 15:46:40.643320 4704 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606\": container with ID starting with c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606 not found: ID does not exist" containerID="c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.643346 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606"} err="failed to get container status \"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606\": rpc error: code = NotFound desc = could not find container \"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606\": container with ID starting with c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.643361 4704 scope.go:117] "RemoveContainer" containerID="717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4" Nov 25 15:46:40 crc kubenswrapper[4704]: E1125 15:46:40.643870 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\": container with ID starting with 717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4 not found: ID does not exist" containerID="717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.643955 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4"} err="failed to get container status \"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\": rpc error: code = NotFound desc = could not find container 
\"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\": container with ID starting with 717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.644012 4704 scope.go:117] "RemoveContainer" containerID="b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2" Nov 25 15:46:40 crc kubenswrapper[4704]: E1125 15:46:40.644869 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\": container with ID starting with b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2 not found: ID does not exist" containerID="b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.644914 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2"} err="failed to get container status \"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\": rpc error: code = NotFound desc = could not find container \"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\": container with ID starting with b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.644936 4704 scope.go:117] "RemoveContainer" containerID="d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10" Nov 25 15:46:40 crc kubenswrapper[4704]: E1125 15:46:40.645460 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\": container with ID starting with d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10 not found: ID does not exist" 
containerID="d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.645516 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10"} err="failed to get container status \"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\": rpc error: code = NotFound desc = could not find container \"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\": container with ID starting with d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.645566 4704 scope.go:117] "RemoveContainer" containerID="10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7" Nov 25 15:46:40 crc kubenswrapper[4704]: E1125 15:46:40.646021 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\": container with ID starting with 10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7 not found: ID does not exist" containerID="10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.646055 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7"} err="failed to get container status \"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\": rpc error: code = NotFound desc = could not find container \"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\": container with ID starting with 10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.646077 4704 scope.go:117] 
"RemoveContainer" containerID="9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc" Nov 25 15:46:40 crc kubenswrapper[4704]: E1125 15:46:40.646488 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\": container with ID starting with 9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc not found: ID does not exist" containerID="9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.646524 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc"} err="failed to get container status \"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\": rpc error: code = NotFound desc = could not find container \"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\": container with ID starting with 9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.646542 4704 scope.go:117] "RemoveContainer" containerID="7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a" Nov 25 15:46:40 crc kubenswrapper[4704]: E1125 15:46:40.646872 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\": container with ID starting with 7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a not found: ID does not exist" containerID="7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.646915 4704 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a"} err="failed to get container status \"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\": rpc error: code = NotFound desc = could not find container \"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\": container with ID starting with 7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.646937 4704 scope.go:117] "RemoveContainer" containerID="1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a" Nov 25 15:46:40 crc kubenswrapper[4704]: E1125 15:46:40.647506 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\": container with ID starting with 1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a not found: ID does not exist" containerID="1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.647563 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a"} err="failed to get container status \"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\": rpc error: code = NotFound desc = could not find container \"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\": container with ID starting with 1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.647596 4704 scope.go:117] "RemoveContainer" containerID="58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58" Nov 25 15:46:40 crc kubenswrapper[4704]: E1125 15:46:40.648096 4704 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\": container with ID starting with 58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58 not found: ID does not exist" containerID="58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.648134 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58"} err="failed to get container status \"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\": rpc error: code = NotFound desc = could not find container \"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\": container with ID starting with 58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.648158 4704 scope.go:117] "RemoveContainer" containerID="1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.648503 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082"} err="failed to get container status \"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082\": rpc error: code = NotFound desc = could not find container \"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082\": container with ID starting with 1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.648533 4704 scope.go:117] "RemoveContainer" containerID="c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.648881 4704 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606"} err="failed to get container status \"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606\": rpc error: code = NotFound desc = could not find container \"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606\": container with ID starting with c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.648924 4704 scope.go:117] "RemoveContainer" containerID="717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.649230 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4"} err="failed to get container status \"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\": rpc error: code = NotFound desc = could not find container \"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\": container with ID starting with 717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.649260 4704 scope.go:117] "RemoveContainer" containerID="b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.649574 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2"} err="failed to get container status \"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\": rpc error: code = NotFound desc = could not find container \"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\": container with ID starting with 
b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.649608 4704 scope.go:117] "RemoveContainer" containerID="d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.650325 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10"} err="failed to get container status \"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\": rpc error: code = NotFound desc = could not find container \"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\": container with ID starting with d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.650369 4704 scope.go:117] "RemoveContainer" containerID="10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.650769 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7"} err="failed to get container status \"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\": rpc error: code = NotFound desc = could not find container \"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\": container with ID starting with 10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.650807 4704 scope.go:117] "RemoveContainer" containerID="9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.651276 4704 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc"} err="failed to get container status \"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\": rpc error: code = NotFound desc = could not find container \"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\": container with ID starting with 9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.651313 4704 scope.go:117] "RemoveContainer" containerID="7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.651642 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a"} err="failed to get container status \"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\": rpc error: code = NotFound desc = could not find container \"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\": container with ID starting with 7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.651664 4704 scope.go:117] "RemoveContainer" containerID="1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.652126 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a"} err="failed to get container status \"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\": rpc error: code = NotFound desc = could not find container \"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\": container with ID starting with 1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a not found: ID does not 
exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.652170 4704 scope.go:117] "RemoveContainer" containerID="58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.652550 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58"} err="failed to get container status \"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\": rpc error: code = NotFound desc = could not find container \"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\": container with ID starting with 58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.652573 4704 scope.go:117] "RemoveContainer" containerID="1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.653045 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082"} err="failed to get container status \"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082\": rpc error: code = NotFound desc = could not find container \"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082\": container with ID starting with 1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.653075 4704 scope.go:117] "RemoveContainer" containerID="c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.653541 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606"} err="failed to get container status 
\"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606\": rpc error: code = NotFound desc = could not find container \"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606\": container with ID starting with c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.653572 4704 scope.go:117] "RemoveContainer" containerID="717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.654212 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4"} err="failed to get container status \"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\": rpc error: code = NotFound desc = could not find container \"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\": container with ID starting with 717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.654245 4704 scope.go:117] "RemoveContainer" containerID="b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.654685 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2"} err="failed to get container status \"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\": rpc error: code = NotFound desc = could not find container \"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\": container with ID starting with b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.654708 4704 scope.go:117] "RemoveContainer" 
containerID="d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.655152 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10"} err="failed to get container status \"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\": rpc error: code = NotFound desc = could not find container \"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\": container with ID starting with d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.655174 4704 scope.go:117] "RemoveContainer" containerID="10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.655489 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7"} err="failed to get container status \"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\": rpc error: code = NotFound desc = could not find container \"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\": container with ID starting with 10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.655511 4704 scope.go:117] "RemoveContainer" containerID="9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.655766 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc"} err="failed to get container status \"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\": rpc error: code = NotFound desc = could 
not find container \"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\": container with ID starting with 9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.655818 4704 scope.go:117] "RemoveContainer" containerID="7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.656126 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a"} err="failed to get container status \"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\": rpc error: code = NotFound desc = could not find container \"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\": container with ID starting with 7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.656149 4704 scope.go:117] "RemoveContainer" containerID="1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.656590 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a"} err="failed to get container status \"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\": rpc error: code = NotFound desc = could not find container \"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\": container with ID starting with 1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.656613 4704 scope.go:117] "RemoveContainer" containerID="58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 
15:46:40.656919 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58"} err="failed to get container status \"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\": rpc error: code = NotFound desc = could not find container \"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\": container with ID starting with 58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.656946 4704 scope.go:117] "RemoveContainer" containerID="1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.657299 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082"} err="failed to get container status \"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082\": rpc error: code = NotFound desc = could not find container \"1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082\": container with ID starting with 1ef5c18b2d2e8c959d0dcdf38e7902387ffce8cefab68b9a6cada882d5184082 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.657325 4704 scope.go:117] "RemoveContainer" containerID="c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.657567 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606"} err="failed to get container status \"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606\": rpc error: code = NotFound desc = could not find container \"c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606\": container with ID starting with 
c9980cbccfeaec5a5f5e995970341eead2247b1cfd88fbebb59dc9e86f1f0606 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.657609 4704 scope.go:117] "RemoveContainer" containerID="717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.658060 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4"} err="failed to get container status \"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\": rpc error: code = NotFound desc = could not find container \"717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4\": container with ID starting with 717a6a0791d4a022debdb3551590dd0ee0760834deda169679d800eedfe502a4 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.658085 4704 scope.go:117] "RemoveContainer" containerID="b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.658382 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2"} err="failed to get container status \"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\": rpc error: code = NotFound desc = could not find container \"b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2\": container with ID starting with b5c367b9c5ea75e7423d663b398ac9bccc4b9bd5375f749db01fdd4e4c138cb2 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.658419 4704 scope.go:117] "RemoveContainer" containerID="d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.658807 4704 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10"} err="failed to get container status \"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\": rpc error: code = NotFound desc = could not find container \"d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10\": container with ID starting with d176c7f0c3e5351a6fcb0a4d8a4b06f20abf9b3443607b1781af62d5d1dffd10 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.658832 4704 scope.go:117] "RemoveContainer" containerID="10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.659080 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7"} err="failed to get container status \"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\": rpc error: code = NotFound desc = could not find container \"10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7\": container with ID starting with 10efddfbd9dea600f1126828203363952acbceb3c9dece37e439cc81d88e04d7 not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.659100 4704 scope.go:117] "RemoveContainer" containerID="9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.659365 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc"} err="failed to get container status \"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\": rpc error: code = NotFound desc = could not find container \"9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc\": container with ID starting with 9fbd065aa30376a9de65956a0eff24da553b6044c7d6309fec335bb3ffd004bc not found: ID does not 
exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.659389 4704 scope.go:117] "RemoveContainer" containerID="7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.659634 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a"} err="failed to get container status \"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\": rpc error: code = NotFound desc = could not find container \"7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a\": container with ID starting with 7d86b5493313ef36f22ebb5338c8a929b54551a6e6beb2b88c571c087a764b9a not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.659656 4704 scope.go:117] "RemoveContainer" containerID="1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.659979 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a"} err="failed to get container status \"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\": rpc error: code = NotFound desc = could not find container \"1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a\": container with ID starting with 1bb9ded1a637290eb0cca1f11c7c533d220358f9572ec0d8a36c9ece0f6f846a not found: ID does not exist" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.660001 4704 scope.go:117] "RemoveContainer" containerID="58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58" Nov 25 15:46:40 crc kubenswrapper[4704]: I1125 15:46:40.660233 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58"} err="failed to get container status 
\"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\": rpc error: code = NotFound desc = could not find container \"58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58\": container with ID starting with 58b8550120fc356a2b481c3c00c4366eb6bc127426d2ed4bfc338b0561f6be58 not found: ID does not exist" Nov 25 15:46:41 crc kubenswrapper[4704]: I1125 15:46:41.334374 4704 generic.go:334] "Generic (PLEG): container finished" podID="940d9ea0-d9cc-4cf9-9211-34ca72879d09" containerID="f2117d750bbe3df00711bfd92a9d1d9366d88d1ab6f69709041e8ce7965adf99" exitCode=0 Nov 25 15:46:41 crc kubenswrapper[4704]: I1125 15:46:41.334451 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" event={"ID":"940d9ea0-d9cc-4cf9-9211-34ca72879d09","Type":"ContainerDied","Data":"f2117d750bbe3df00711bfd92a9d1d9366d88d1ab6f69709041e8ce7965adf99"} Nov 25 15:46:41 crc kubenswrapper[4704]: I1125 15:46:41.334486 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" event={"ID":"940d9ea0-d9cc-4cf9-9211-34ca72879d09","Type":"ContainerStarted","Data":"897419da2d120e4503ef0685e62d5e120b2a737ae74c57cadeca832bdc9c5405"} Nov 25 15:46:41 crc kubenswrapper[4704]: I1125 15:46:41.337052 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h92xm_d2820ade-e9bd-4146-b275-0c3b7d0cb5aa/kube-multus/2.log" Nov 25 15:46:42 crc kubenswrapper[4704]: I1125 15:46:42.348751 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" event={"ID":"940d9ea0-d9cc-4cf9-9211-34ca72879d09","Type":"ContainerStarted","Data":"c8a9fa8c1138c1c510269bf55a8d97da208ab918f129ad772640e4896aff1a28"} Nov 25 15:46:42 crc kubenswrapper[4704]: I1125 15:46:42.349390 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" 
event={"ID":"940d9ea0-d9cc-4cf9-9211-34ca72879d09","Type":"ContainerStarted","Data":"03032eda276e86a4677436134e81c90b066a2f94677da0f64f96d3fc57bf545f"} Nov 25 15:46:42 crc kubenswrapper[4704]: I1125 15:46:42.349415 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" event={"ID":"940d9ea0-d9cc-4cf9-9211-34ca72879d09","Type":"ContainerStarted","Data":"8f4412d04e147668b2fbed5e66b8562ba8fb5998fd73b85b6e76956ad63e94f4"} Nov 25 15:46:42 crc kubenswrapper[4704]: I1125 15:46:42.349435 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" event={"ID":"940d9ea0-d9cc-4cf9-9211-34ca72879d09","Type":"ContainerStarted","Data":"06c6e07b71a77529bfa5f8edc926e3addd13214d8a29cab6608e34fd8e2adf73"} Nov 25 15:46:42 crc kubenswrapper[4704]: I1125 15:46:42.349455 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" event={"ID":"940d9ea0-d9cc-4cf9-9211-34ca72879d09","Type":"ContainerStarted","Data":"eebcfa8fcbc14698bfbc6cfe386400ec858d5497c5d8fcba670e48f0c26d3ac5"} Nov 25 15:46:42 crc kubenswrapper[4704]: I1125 15:46:42.349475 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" event={"ID":"940d9ea0-d9cc-4cf9-9211-34ca72879d09","Type":"ContainerStarted","Data":"f0f968f1115aa023405064b8db6f0b7cb297c075912fe811726af1f1bd5a93ca"} Nov 25 15:46:44 crc kubenswrapper[4704]: I1125 15:46:44.365277 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" event={"ID":"940d9ea0-d9cc-4cf9-9211-34ca72879d09","Type":"ContainerStarted","Data":"24ce7c3ba0cd158b10ca38b9a6c0fd241664596044a63d0c29f30652ab027e65"} Nov 25 15:46:47 crc kubenswrapper[4704]: I1125 15:46:47.384775 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" 
event={"ID":"940d9ea0-d9cc-4cf9-9211-34ca72879d09","Type":"ContainerStarted","Data":"98a26e82f322b8edc79090cc04002fb1ff5b3542bf5ea74f1e06196ee66ceab5"} Nov 25 15:46:47 crc kubenswrapper[4704]: I1125 15:46:47.385822 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:47 crc kubenswrapper[4704]: I1125 15:46:47.385843 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:47 crc kubenswrapper[4704]: I1125 15:46:47.420978 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" podStartSLOduration=8.420951242 podStartE2EDuration="8.420951242s" podCreationTimestamp="2025-11-25 15:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:46:47.418204392 +0000 UTC m=+693.686478183" watchObservedRunningTime="2025-11-25 15:46:47.420951242 +0000 UTC m=+693.689225023" Nov 25 15:46:47 crc kubenswrapper[4704]: I1125 15:46:47.432591 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:48 crc kubenswrapper[4704]: I1125 15:46:48.390680 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:48 crc kubenswrapper[4704]: I1125 15:46:48.423844 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:46:53 crc kubenswrapper[4704]: I1125 15:46:53.416632 4704 scope.go:117] "RemoveContainer" containerID="6ebde3bce3ebb98df82c1e2217d50256663339143d5a82ad4958eeed412b4c81" Nov 25 15:46:53 crc kubenswrapper[4704]: E1125 15:46:53.417730 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" 
with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-h92xm_openshift-multus(d2820ade-e9bd-4146-b275-0c3b7d0cb5aa)\"" pod="openshift-multus/multus-h92xm" podUID="d2820ade-e9bd-4146-b275-0c3b7d0cb5aa" Nov 25 15:47:04 crc kubenswrapper[4704]: I1125 15:47:04.252758 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh"] Nov 25 15:47:04 crc kubenswrapper[4704]: I1125 15:47:04.254545 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:04 crc kubenswrapper[4704]: I1125 15:47:04.257942 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 25 15:47:04 crc kubenswrapper[4704]: I1125 15:47:04.263238 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh"] Nov 25 15:47:04 crc kubenswrapper[4704]: I1125 15:47:04.421546 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe45db4d-f0e6-4706-8d50-f9777e8aff80-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh\" (UID: \"fe45db4d-f0e6-4706-8d50-f9777e8aff80\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:04 crc kubenswrapper[4704]: I1125 15:47:04.422106 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzfpv\" (UniqueName: \"kubernetes.io/projected/fe45db4d-f0e6-4706-8d50-f9777e8aff80-kube-api-access-pzfpv\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh\" (UID: \"fe45db4d-f0e6-4706-8d50-f9777e8aff80\") " 
pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:04 crc kubenswrapper[4704]: I1125 15:47:04.422164 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe45db4d-f0e6-4706-8d50-f9777e8aff80-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh\" (UID: \"fe45db4d-f0e6-4706-8d50-f9777e8aff80\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:04 crc kubenswrapper[4704]: I1125 15:47:04.523905 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe45db4d-f0e6-4706-8d50-f9777e8aff80-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh\" (UID: \"fe45db4d-f0e6-4706-8d50-f9777e8aff80\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:04 crc kubenswrapper[4704]: I1125 15:47:04.523972 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe45db4d-f0e6-4706-8d50-f9777e8aff80-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh\" (UID: \"fe45db4d-f0e6-4706-8d50-f9777e8aff80\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:04 crc kubenswrapper[4704]: I1125 15:47:04.524068 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzfpv\" (UniqueName: \"kubernetes.io/projected/fe45db4d-f0e6-4706-8d50-f9777e8aff80-kube-api-access-pzfpv\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh\" (UID: \"fe45db4d-f0e6-4706-8d50-f9777e8aff80\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:04 crc kubenswrapper[4704]: I1125 
15:47:04.524772 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe45db4d-f0e6-4706-8d50-f9777e8aff80-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh\" (UID: \"fe45db4d-f0e6-4706-8d50-f9777e8aff80\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:04 crc kubenswrapper[4704]: I1125 15:47:04.524842 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe45db4d-f0e6-4706-8d50-f9777e8aff80-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh\" (UID: \"fe45db4d-f0e6-4706-8d50-f9777e8aff80\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:04 crc kubenswrapper[4704]: I1125 15:47:04.548480 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzfpv\" (UniqueName: \"kubernetes.io/projected/fe45db4d-f0e6-4706-8d50-f9777e8aff80-kube-api-access-pzfpv\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh\" (UID: \"fe45db4d-f0e6-4706-8d50-f9777e8aff80\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:04 crc kubenswrapper[4704]: I1125 15:47:04.582261 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:04 crc kubenswrapper[4704]: E1125 15:47:04.608130 4704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_openshift-marketplace_fe45db4d-f0e6-4706-8d50-f9777e8aff80_0(df519f7bb15aa02327b9a39c9bdb439fa20fb3b185d04a5404f26d769fd75017): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 25 15:47:04 crc kubenswrapper[4704]: E1125 15:47:04.608230 4704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_openshift-marketplace_fe45db4d-f0e6-4706-8d50-f9777e8aff80_0(df519f7bb15aa02327b9a39c9bdb439fa20fb3b185d04a5404f26d769fd75017): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:04 crc kubenswrapper[4704]: E1125 15:47:04.608258 4704 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_openshift-marketplace_fe45db4d-f0e6-4706-8d50-f9777e8aff80_0(df519f7bb15aa02327b9a39c9bdb439fa20fb3b185d04a5404f26d769fd75017): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:04 crc kubenswrapper[4704]: E1125 15:47:04.608313 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_openshift-marketplace(fe45db4d-f0e6-4706-8d50-f9777e8aff80)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_openshift-marketplace(fe45db4d-f0e6-4706-8d50-f9777e8aff80)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_openshift-marketplace_fe45db4d-f0e6-4706-8d50-f9777e8aff80_0(df519f7bb15aa02327b9a39c9bdb439fa20fb3b185d04a5404f26d769fd75017): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" podUID="fe45db4d-f0e6-4706-8d50-f9777e8aff80" Nov 25 15:47:05 crc kubenswrapper[4704]: I1125 15:47:05.473347 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:05 crc kubenswrapper[4704]: I1125 15:47:05.473821 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:05 crc kubenswrapper[4704]: E1125 15:47:05.502422 4704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_openshift-marketplace_fe45db4d-f0e6-4706-8d50-f9777e8aff80_0(716ebf17ab713f87b498a6ea4a4b880ff6ee667537996ed1a533abf6264bde1f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Nov 25 15:47:05 crc kubenswrapper[4704]: E1125 15:47:05.502533 4704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_openshift-marketplace_fe45db4d-f0e6-4706-8d50-f9777e8aff80_0(716ebf17ab713f87b498a6ea4a4b880ff6ee667537996ed1a533abf6264bde1f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:05 crc kubenswrapper[4704]: E1125 15:47:05.502612 4704 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_openshift-marketplace_fe45db4d-f0e6-4706-8d50-f9777e8aff80_0(716ebf17ab713f87b498a6ea4a4b880ff6ee667537996ed1a533abf6264bde1f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:05 crc kubenswrapper[4704]: E1125 15:47:05.502701 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_openshift-marketplace(fe45db4d-f0e6-4706-8d50-f9777e8aff80)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_openshift-marketplace(fe45db4d-f0e6-4706-8d50-f9777e8aff80)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_openshift-marketplace_fe45db4d-f0e6-4706-8d50-f9777e8aff80_0(716ebf17ab713f87b498a6ea4a4b880ff6ee667537996ed1a533abf6264bde1f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" podUID="fe45db4d-f0e6-4706-8d50-f9777e8aff80" Nov 25 15:47:06 crc kubenswrapper[4704]: I1125 15:47:06.416755 4704 scope.go:117] "RemoveContainer" containerID="6ebde3bce3ebb98df82c1e2217d50256663339143d5a82ad4958eeed412b4c81" Nov 25 15:47:07 crc kubenswrapper[4704]: I1125 15:47:07.487266 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h92xm_d2820ade-e9bd-4146-b275-0c3b7d0cb5aa/kube-multus/2.log" Nov 25 15:47:07 crc kubenswrapper[4704]: I1125 15:47:07.487727 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h92xm" event={"ID":"d2820ade-e9bd-4146-b275-0c3b7d0cb5aa","Type":"ContainerStarted","Data":"c2bfea66c44a72e4e265d2a96499d06c30e7506f2d83916765d823fbf7591ece"} Nov 25 15:47:10 crc kubenswrapper[4704]: I1125 15:47:10.336745 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sc8dr" Nov 25 15:47:21 crc kubenswrapper[4704]: 
I1125 15:47:21.416062 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:21 crc kubenswrapper[4704]: I1125 15:47:21.418664 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:21 crc kubenswrapper[4704]: I1125 15:47:21.596158 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh"] Nov 25 15:47:21 crc kubenswrapper[4704]: W1125 15:47:21.602094 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe45db4d_f0e6_4706_8d50_f9777e8aff80.slice/crio-1b5f549bdd3443ce67c357bffddfc17256811d3c1c361ca82f780fc75b3d7eea WatchSource:0}: Error finding container 1b5f549bdd3443ce67c357bffddfc17256811d3c1c361ca82f780fc75b3d7eea: Status 404 returned error can't find the container with id 1b5f549bdd3443ce67c357bffddfc17256811d3c1c361ca82f780fc75b3d7eea Nov 25 15:47:22 crc kubenswrapper[4704]: I1125 15:47:22.578537 4704 generic.go:334] "Generic (PLEG): container finished" podID="fe45db4d-f0e6-4706-8d50-f9777e8aff80" containerID="722003328f07663e8417efef86d7517c7d80fb9f5f13c896a26c0d7223278ba7" exitCode=0 Nov 25 15:47:22 crc kubenswrapper[4704]: I1125 15:47:22.578610 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" event={"ID":"fe45db4d-f0e6-4706-8d50-f9777e8aff80","Type":"ContainerDied","Data":"722003328f07663e8417efef86d7517c7d80fb9f5f13c896a26c0d7223278ba7"} Nov 25 15:47:22 crc kubenswrapper[4704]: I1125 15:47:22.579182 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" 
event={"ID":"fe45db4d-f0e6-4706-8d50-f9777e8aff80","Type":"ContainerStarted","Data":"1b5f549bdd3443ce67c357bffddfc17256811d3c1c361ca82f780fc75b3d7eea"} Nov 25 15:47:22 crc kubenswrapper[4704]: I1125 15:47:22.580652 4704 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 15:47:24 crc kubenswrapper[4704]: I1125 15:47:24.593122 4704 generic.go:334] "Generic (PLEG): container finished" podID="fe45db4d-f0e6-4706-8d50-f9777e8aff80" containerID="1ba111712b57e747fbede72ec87274487a1867582d89dad5fb9fb2c4964fefe7" exitCode=0 Nov 25 15:47:24 crc kubenswrapper[4704]: I1125 15:47:24.593449 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" event={"ID":"fe45db4d-f0e6-4706-8d50-f9777e8aff80","Type":"ContainerDied","Data":"1ba111712b57e747fbede72ec87274487a1867582d89dad5fb9fb2c4964fefe7"} Nov 25 15:47:25 crc kubenswrapper[4704]: I1125 15:47:25.601029 4704 generic.go:334] "Generic (PLEG): container finished" podID="fe45db4d-f0e6-4706-8d50-f9777e8aff80" containerID="cfaa8992d6c0ed785efe9a6b306a63a627139dd386468ad0d7440b1e54379414" exitCode=0 Nov 25 15:47:25 crc kubenswrapper[4704]: I1125 15:47:25.601082 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" event={"ID":"fe45db4d-f0e6-4706-8d50-f9777e8aff80","Type":"ContainerDied","Data":"cfaa8992d6c0ed785efe9a6b306a63a627139dd386468ad0d7440b1e54379414"} Nov 25 15:47:26 crc kubenswrapper[4704]: I1125 15:47:26.828902 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:26 crc kubenswrapper[4704]: I1125 15:47:26.940840 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzfpv\" (UniqueName: \"kubernetes.io/projected/fe45db4d-f0e6-4706-8d50-f9777e8aff80-kube-api-access-pzfpv\") pod \"fe45db4d-f0e6-4706-8d50-f9777e8aff80\" (UID: \"fe45db4d-f0e6-4706-8d50-f9777e8aff80\") " Nov 25 15:47:26 crc kubenswrapper[4704]: I1125 15:47:26.940953 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe45db4d-f0e6-4706-8d50-f9777e8aff80-util\") pod \"fe45db4d-f0e6-4706-8d50-f9777e8aff80\" (UID: \"fe45db4d-f0e6-4706-8d50-f9777e8aff80\") " Nov 25 15:47:26 crc kubenswrapper[4704]: I1125 15:47:26.941038 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe45db4d-f0e6-4706-8d50-f9777e8aff80-bundle\") pod \"fe45db4d-f0e6-4706-8d50-f9777e8aff80\" (UID: \"fe45db4d-f0e6-4706-8d50-f9777e8aff80\") " Nov 25 15:47:26 crc kubenswrapper[4704]: I1125 15:47:26.942404 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe45db4d-f0e6-4706-8d50-f9777e8aff80-bundle" (OuterVolumeSpecName: "bundle") pod "fe45db4d-f0e6-4706-8d50-f9777e8aff80" (UID: "fe45db4d-f0e6-4706-8d50-f9777e8aff80"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:47:26 crc kubenswrapper[4704]: I1125 15:47:26.942880 4704 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe45db4d-f0e6-4706-8d50-f9777e8aff80-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:26 crc kubenswrapper[4704]: I1125 15:47:26.950079 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe45db4d-f0e6-4706-8d50-f9777e8aff80-kube-api-access-pzfpv" (OuterVolumeSpecName: "kube-api-access-pzfpv") pod "fe45db4d-f0e6-4706-8d50-f9777e8aff80" (UID: "fe45db4d-f0e6-4706-8d50-f9777e8aff80"). InnerVolumeSpecName "kube-api-access-pzfpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:47:26 crc kubenswrapper[4704]: I1125 15:47:26.956957 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe45db4d-f0e6-4706-8d50-f9777e8aff80-util" (OuterVolumeSpecName: "util") pod "fe45db4d-f0e6-4706-8d50-f9777e8aff80" (UID: "fe45db4d-f0e6-4706-8d50-f9777e8aff80"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:47:27 crc kubenswrapper[4704]: I1125 15:47:27.044577 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzfpv\" (UniqueName: \"kubernetes.io/projected/fe45db4d-f0e6-4706-8d50-f9777e8aff80-kube-api-access-pzfpv\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:27 crc kubenswrapper[4704]: I1125 15:47:27.044655 4704 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe45db4d-f0e6-4706-8d50-f9777e8aff80-util\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:27 crc kubenswrapper[4704]: I1125 15:47:27.614314 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" event={"ID":"fe45db4d-f0e6-4706-8d50-f9777e8aff80","Type":"ContainerDied","Data":"1b5f549bdd3443ce67c357bffddfc17256811d3c1c361ca82f780fc75b3d7eea"} Nov 25 15:47:27 crc kubenswrapper[4704]: I1125 15:47:27.614367 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b5f549bdd3443ce67c357bffddfc17256811d3c1c361ca82f780fc75b3d7eea" Nov 25 15:47:27 crc kubenswrapper[4704]: I1125 15:47:27.614452 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.493538 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs"] Nov 25 15:47:37 crc kubenswrapper[4704]: E1125 15:47:37.496130 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe45db4d-f0e6-4706-8d50-f9777e8aff80" containerName="util" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.496230 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe45db4d-f0e6-4706-8d50-f9777e8aff80" containerName="util" Nov 25 15:47:37 crc kubenswrapper[4704]: E1125 15:47:37.496323 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe45db4d-f0e6-4706-8d50-f9777e8aff80" containerName="extract" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.496393 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe45db4d-f0e6-4706-8d50-f9777e8aff80" containerName="extract" Nov 25 15:47:37 crc kubenswrapper[4704]: E1125 15:47:37.496480 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe45db4d-f0e6-4706-8d50-f9777e8aff80" containerName="pull" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.496554 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe45db4d-f0e6-4706-8d50-f9777e8aff80" containerName="pull" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.496750 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe45db4d-f0e6-4706-8d50-f9777e8aff80" containerName="extract" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.497273 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.499874 4704 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.500097 4704 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-wmzmk" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.500191 4704 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.500497 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.500867 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.513000 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs"] Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.594460 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkn4t\" (UniqueName: \"kubernetes.io/projected/d7e15179-0dfe-4339-a233-4ebea59bc0f6-kube-api-access-rkn4t\") pod \"metallb-operator-controller-manager-5f4957f9b7-lcxrs\" (UID: \"d7e15179-0dfe-4339-a233-4ebea59bc0f6\") " pod="metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.594535 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d7e15179-0dfe-4339-a233-4ebea59bc0f6-apiservice-cert\") pod 
\"metallb-operator-controller-manager-5f4957f9b7-lcxrs\" (UID: \"d7e15179-0dfe-4339-a233-4ebea59bc0f6\") " pod="metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.594592 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d7e15179-0dfe-4339-a233-4ebea59bc0f6-webhook-cert\") pod \"metallb-operator-controller-manager-5f4957f9b7-lcxrs\" (UID: \"d7e15179-0dfe-4339-a233-4ebea59bc0f6\") " pod="metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.696337 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d7e15179-0dfe-4339-a233-4ebea59bc0f6-apiservice-cert\") pod \"metallb-operator-controller-manager-5f4957f9b7-lcxrs\" (UID: \"d7e15179-0dfe-4339-a233-4ebea59bc0f6\") " pod="metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.696393 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d7e15179-0dfe-4339-a233-4ebea59bc0f6-webhook-cert\") pod \"metallb-operator-controller-manager-5f4957f9b7-lcxrs\" (UID: \"d7e15179-0dfe-4339-a233-4ebea59bc0f6\") " pod="metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.696489 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkn4t\" (UniqueName: \"kubernetes.io/projected/d7e15179-0dfe-4339-a233-4ebea59bc0f6-kube-api-access-rkn4t\") pod \"metallb-operator-controller-manager-5f4957f9b7-lcxrs\" (UID: \"d7e15179-0dfe-4339-a233-4ebea59bc0f6\") " pod="metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs" Nov 25 15:47:37 crc 
kubenswrapper[4704]: I1125 15:47:37.705553 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d7e15179-0dfe-4339-a233-4ebea59bc0f6-webhook-cert\") pod \"metallb-operator-controller-manager-5f4957f9b7-lcxrs\" (UID: \"d7e15179-0dfe-4339-a233-4ebea59bc0f6\") " pod="metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.706487 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d7e15179-0dfe-4339-a233-4ebea59bc0f6-apiservice-cert\") pod \"metallb-operator-controller-manager-5f4957f9b7-lcxrs\" (UID: \"d7e15179-0dfe-4339-a233-4ebea59bc0f6\") " pod="metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.718309 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkn4t\" (UniqueName: \"kubernetes.io/projected/d7e15179-0dfe-4339-a233-4ebea59bc0f6-kube-api-access-rkn4t\") pod \"metallb-operator-controller-manager-5f4957f9b7-lcxrs\" (UID: \"d7e15179-0dfe-4339-a233-4ebea59bc0f6\") " pod="metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.772972 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc"] Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.773643 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.776130 4704 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.776357 4704 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-bmv54" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.777154 4704 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.798043 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbqfc\" (UniqueName: \"kubernetes.io/projected/ca5ed8d1-6524-4d56-a49a-afb3cf8a5320-kube-api-access-sbqfc\") pod \"metallb-operator-webhook-server-65c6dc9bcf-5m5qc\" (UID: \"ca5ed8d1-6524-4d56-a49a-afb3cf8a5320\") " pod="metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.798131 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca5ed8d1-6524-4d56-a49a-afb3cf8a5320-webhook-cert\") pod \"metallb-operator-webhook-server-65c6dc9bcf-5m5qc\" (UID: \"ca5ed8d1-6524-4d56-a49a-afb3cf8a5320\") " pod="metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.798163 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca5ed8d1-6524-4d56-a49a-afb3cf8a5320-apiservice-cert\") pod \"metallb-operator-webhook-server-65c6dc9bcf-5m5qc\" (UID: \"ca5ed8d1-6524-4d56-a49a-afb3cf8a5320\") " pod="metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc" 
Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.804409 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc"] Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.815365 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.900310 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca5ed8d1-6524-4d56-a49a-afb3cf8a5320-webhook-cert\") pod \"metallb-operator-webhook-server-65c6dc9bcf-5m5qc\" (UID: \"ca5ed8d1-6524-4d56-a49a-afb3cf8a5320\") " pod="metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.900373 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca5ed8d1-6524-4d56-a49a-afb3cf8a5320-apiservice-cert\") pod \"metallb-operator-webhook-server-65c6dc9bcf-5m5qc\" (UID: \"ca5ed8d1-6524-4d56-a49a-afb3cf8a5320\") " pod="metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.900435 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbqfc\" (UniqueName: \"kubernetes.io/projected/ca5ed8d1-6524-4d56-a49a-afb3cf8a5320-kube-api-access-sbqfc\") pod \"metallb-operator-webhook-server-65c6dc9bcf-5m5qc\" (UID: \"ca5ed8d1-6524-4d56-a49a-afb3cf8a5320\") " pod="metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.923733 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca5ed8d1-6524-4d56-a49a-afb3cf8a5320-apiservice-cert\") pod 
\"metallb-operator-webhook-server-65c6dc9bcf-5m5qc\" (UID: \"ca5ed8d1-6524-4d56-a49a-afb3cf8a5320\") " pod="metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.929516 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca5ed8d1-6524-4d56-a49a-afb3cf8a5320-webhook-cert\") pod \"metallb-operator-webhook-server-65c6dc9bcf-5m5qc\" (UID: \"ca5ed8d1-6524-4d56-a49a-afb3cf8a5320\") " pod="metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc" Nov 25 15:47:37 crc kubenswrapper[4704]: I1125 15:47:37.941873 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbqfc\" (UniqueName: \"kubernetes.io/projected/ca5ed8d1-6524-4d56-a49a-afb3cf8a5320-kube-api-access-sbqfc\") pod \"metallb-operator-webhook-server-65c6dc9bcf-5m5qc\" (UID: \"ca5ed8d1-6524-4d56-a49a-afb3cf8a5320\") " pod="metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc" Nov 25 15:47:38 crc kubenswrapper[4704]: I1125 15:47:38.091313 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc" Nov 25 15:47:38 crc kubenswrapper[4704]: I1125 15:47:38.352330 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc"] Nov 25 15:47:38 crc kubenswrapper[4704]: I1125 15:47:38.357065 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs"] Nov 25 15:47:38 crc kubenswrapper[4704]: W1125 15:47:38.361249 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7e15179_0dfe_4339_a233_4ebea59bc0f6.slice/crio-04462d22327239caf3d83245442d2ffdd882a14aa3a3cf04767545e70d321a40 WatchSource:0}: Error finding container 04462d22327239caf3d83245442d2ffdd882a14aa3a3cf04767545e70d321a40: Status 404 returned error can't find the container with id 04462d22327239caf3d83245442d2ffdd882a14aa3a3cf04767545e70d321a40 Nov 25 15:47:38 crc kubenswrapper[4704]: I1125 15:47:38.669873 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc" event={"ID":"ca5ed8d1-6524-4d56-a49a-afb3cf8a5320","Type":"ContainerStarted","Data":"40554151102d2bc6efc30be16cf2a47aa19278924561a7edaa5e110d28b93d17"} Nov 25 15:47:38 crc kubenswrapper[4704]: I1125 15:47:38.671942 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs" event={"ID":"d7e15179-0dfe-4339-a233-4ebea59bc0f6","Type":"ContainerStarted","Data":"04462d22327239caf3d83245442d2ffdd882a14aa3a3cf04767545e70d321a40"} Nov 25 15:47:44 crc kubenswrapper[4704]: I1125 15:47:44.714355 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs" 
event={"ID":"d7e15179-0dfe-4339-a233-4ebea59bc0f6","Type":"ContainerStarted","Data":"ab7d8164f63d9b59a54e16ba15b5b493a0d9529cd71bd23dc5ac1aae9db96e14"} Nov 25 15:47:44 crc kubenswrapper[4704]: I1125 15:47:44.716501 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs" Nov 25 15:47:44 crc kubenswrapper[4704]: I1125 15:47:44.719194 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc" event={"ID":"ca5ed8d1-6524-4d56-a49a-afb3cf8a5320","Type":"ContainerStarted","Data":"7b5a478ce7cd0e1d4ef6263ba88f985959f918c49db23980d61d7b103cc300e4"} Nov 25 15:47:44 crc kubenswrapper[4704]: I1125 15:47:44.719404 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc" Nov 25 15:47:44 crc kubenswrapper[4704]: I1125 15:47:44.750587 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs" podStartSLOduration=4.030128394 podStartE2EDuration="7.75055885s" podCreationTimestamp="2025-11-25 15:47:37 +0000 UTC" firstStartedPulling="2025-11-25 15:47:38.363116903 +0000 UTC m=+744.631390684" lastFinishedPulling="2025-11-25 15:47:42.083547359 +0000 UTC m=+748.351821140" observedRunningTime="2025-11-25 15:47:44.748307757 +0000 UTC m=+751.016581558" watchObservedRunningTime="2025-11-25 15:47:44.75055885 +0000 UTC m=+751.018832631" Nov 25 15:47:44 crc kubenswrapper[4704]: I1125 15:47:44.780685 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc" podStartSLOduration=2.300985949 podStartE2EDuration="7.780653005s" podCreationTimestamp="2025-11-25 15:47:37 +0000 UTC" firstStartedPulling="2025-11-25 15:47:38.362550497 +0000 UTC m=+744.630824278" lastFinishedPulling="2025-11-25 
15:47:43.842217553 +0000 UTC m=+750.110491334" observedRunningTime="2025-11-25 15:47:44.780211513 +0000 UTC m=+751.048485304" watchObservedRunningTime="2025-11-25 15:47:44.780653005 +0000 UTC m=+751.048926796" Nov 25 15:47:46 crc kubenswrapper[4704]: I1125 15:47:46.702023 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gc5rd"] Nov 25 15:47:46 crc kubenswrapper[4704]: I1125 15:47:46.702718 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" podUID="4ce56dcb-a916-41ca-b706-df5e157576eb" containerName="controller-manager" containerID="cri-o://5319d1cc3d02a76721dc0a8427195186872b40d7749d0d58fcb409e37e41d813" gracePeriod=30 Nov 25 15:47:46 crc kubenswrapper[4704]: I1125 15:47:46.814253 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s"] Nov 25 15:47:46 crc kubenswrapper[4704]: I1125 15:47:46.814603 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" podUID="49cc56f5-b9bb-4694-8965-0e7c6a4aaae6" containerName="route-controller-manager" containerID="cri-o://a6281e5f57d6109c78ed7a3af74e9dc80ab0e1e2090aa8f90a37744dabd523f7" gracePeriod=30 Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.104585 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.140993 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-client-ca\") pod \"4ce56dcb-a916-41ca-b706-df5e157576eb\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.141071 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbzd6\" (UniqueName: \"kubernetes.io/projected/4ce56dcb-a916-41ca-b706-df5e157576eb-kube-api-access-bbzd6\") pod \"4ce56dcb-a916-41ca-b706-df5e157576eb\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.141097 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ce56dcb-a916-41ca-b706-df5e157576eb-serving-cert\") pod \"4ce56dcb-a916-41ca-b706-df5e157576eb\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.141128 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-proxy-ca-bundles\") pod \"4ce56dcb-a916-41ca-b706-df5e157576eb\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.141173 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-config\") pod \"4ce56dcb-a916-41ca-b706-df5e157576eb\" (UID: \"4ce56dcb-a916-41ca-b706-df5e157576eb\") " Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.148584 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-client-ca" (OuterVolumeSpecName: "client-ca") pod "4ce56dcb-a916-41ca-b706-df5e157576eb" (UID: "4ce56dcb-a916-41ca-b706-df5e157576eb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.163901 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-config" (OuterVolumeSpecName: "config") pod "4ce56dcb-a916-41ca-b706-df5e157576eb" (UID: "4ce56dcb-a916-41ca-b706-df5e157576eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.165269 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4ce56dcb-a916-41ca-b706-df5e157576eb" (UID: "4ce56dcb-a916-41ca-b706-df5e157576eb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.167109 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce56dcb-a916-41ca-b706-df5e157576eb-kube-api-access-bbzd6" (OuterVolumeSpecName: "kube-api-access-bbzd6") pod "4ce56dcb-a916-41ca-b706-df5e157576eb" (UID: "4ce56dcb-a916-41ca-b706-df5e157576eb"). InnerVolumeSpecName "kube-api-access-bbzd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.175595 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce56dcb-a916-41ca-b706-df5e157576eb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4ce56dcb-a916-41ca-b706-df5e157576eb" (UID: "4ce56dcb-a916-41ca-b706-df5e157576eb"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.242187 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbzd6\" (UniqueName: \"kubernetes.io/projected/4ce56dcb-a916-41ca-b706-df5e157576eb-kube-api-access-bbzd6\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.242244 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ce56dcb-a916-41ca-b706-df5e157576eb-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.242257 4704 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.242266 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.242275 4704 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ce56dcb-a916-41ca-b706-df5e157576eb-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.282004 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.443512 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-config\") pod \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\" (UID: \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\") " Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.443846 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-client-ca\") pod \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\" (UID: \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\") " Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.443940 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6sq2\" (UniqueName: \"kubernetes.io/projected/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-kube-api-access-l6sq2\") pod \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\" (UID: \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\") " Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.443971 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-serving-cert\") pod \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\" (UID: \"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6\") " Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.444412 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-config" (OuterVolumeSpecName: "config") pod "49cc56f5-b9bb-4694-8965-0e7c6a4aaae6" (UID: "49cc56f5-b9bb-4694-8965-0e7c6a4aaae6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.444541 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-client-ca" (OuterVolumeSpecName: "client-ca") pod "49cc56f5-b9bb-4694-8965-0e7c6a4aaae6" (UID: "49cc56f5-b9bb-4694-8965-0e7c6a4aaae6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.447192 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "49cc56f5-b9bb-4694-8965-0e7c6a4aaae6" (UID: "49cc56f5-b9bb-4694-8965-0e7c6a4aaae6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.447256 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-kube-api-access-l6sq2" (OuterVolumeSpecName: "kube-api-access-l6sq2") pod "49cc56f5-b9bb-4694-8965-0e7c6a4aaae6" (UID: "49cc56f5-b9bb-4694-8965-0e7c6a4aaae6"). InnerVolumeSpecName "kube-api-access-l6sq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.545246 4704 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.545297 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6sq2\" (UniqueName: \"kubernetes.io/projected/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-kube-api-access-l6sq2\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.545431 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.545450 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.735286 4704 generic.go:334] "Generic (PLEG): container finished" podID="49cc56f5-b9bb-4694-8965-0e7c6a4aaae6" containerID="a6281e5f57d6109c78ed7a3af74e9dc80ab0e1e2090aa8f90a37744dabd523f7" exitCode=0 Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.735392 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.735390 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" event={"ID":"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6","Type":"ContainerDied","Data":"a6281e5f57d6109c78ed7a3af74e9dc80ab0e1e2090aa8f90a37744dabd523f7"} Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.735532 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s" event={"ID":"49cc56f5-b9bb-4694-8965-0e7c6a4aaae6","Type":"ContainerDied","Data":"15b0ee63a8404dad7e624111bc5f2aec382b9e345c36a329499906c38847a203"} Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.735556 4704 scope.go:117] "RemoveContainer" containerID="a6281e5f57d6109c78ed7a3af74e9dc80ab0e1e2090aa8f90a37744dabd523f7" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.737402 4704 generic.go:334] "Generic (PLEG): container finished" podID="4ce56dcb-a916-41ca-b706-df5e157576eb" containerID="5319d1cc3d02a76721dc0a8427195186872b40d7749d0d58fcb409e37e41d813" exitCode=0 Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.737443 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.737462 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" event={"ID":"4ce56dcb-a916-41ca-b706-df5e157576eb","Type":"ContainerDied","Data":"5319d1cc3d02a76721dc0a8427195186872b40d7749d0d58fcb409e37e41d813"} Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.737543 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gc5rd" event={"ID":"4ce56dcb-a916-41ca-b706-df5e157576eb","Type":"ContainerDied","Data":"279430a122b5f1dac181ab543d7959f7e06e979b1dc10ce71558bd5a618ce27e"} Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.764056 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s"] Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.765477 4704 scope.go:117] "RemoveContainer" containerID="a6281e5f57d6109c78ed7a3af74e9dc80ab0e1e2090aa8f90a37744dabd523f7" Nov 25 15:47:47 crc kubenswrapper[4704]: E1125 15:47:47.769012 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6281e5f57d6109c78ed7a3af74e9dc80ab0e1e2090aa8f90a37744dabd523f7\": container with ID starting with a6281e5f57d6109c78ed7a3af74e9dc80ab0e1e2090aa8f90a37744dabd523f7 not found: ID does not exist" containerID="a6281e5f57d6109c78ed7a3af74e9dc80ab0e1e2090aa8f90a37744dabd523f7" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.769103 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6281e5f57d6109c78ed7a3af74e9dc80ab0e1e2090aa8f90a37744dabd523f7"} err="failed to get container status \"a6281e5f57d6109c78ed7a3af74e9dc80ab0e1e2090aa8f90a37744dabd523f7\": rpc error: code = NotFound desc = could not 
find container \"a6281e5f57d6109c78ed7a3af74e9dc80ab0e1e2090aa8f90a37744dabd523f7\": container with ID starting with a6281e5f57d6109c78ed7a3af74e9dc80ab0e1e2090aa8f90a37744dabd523f7 not found: ID does not exist" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.769169 4704 scope.go:117] "RemoveContainer" containerID="5319d1cc3d02a76721dc0a8427195186872b40d7749d0d58fcb409e37e41d813" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.773286 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6s99s"] Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.790708 4704 scope.go:117] "RemoveContainer" containerID="5319d1cc3d02a76721dc0a8427195186872b40d7749d0d58fcb409e37e41d813" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.791024 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gc5rd"] Nov 25 15:47:47 crc kubenswrapper[4704]: E1125 15:47:47.791341 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5319d1cc3d02a76721dc0a8427195186872b40d7749d0d58fcb409e37e41d813\": container with ID starting with 5319d1cc3d02a76721dc0a8427195186872b40d7749d0d58fcb409e37e41d813 not found: ID does not exist" containerID="5319d1cc3d02a76721dc0a8427195186872b40d7749d0d58fcb409e37e41d813" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.791376 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5319d1cc3d02a76721dc0a8427195186872b40d7749d0d58fcb409e37e41d813"} err="failed to get container status \"5319d1cc3d02a76721dc0a8427195186872b40d7749d0d58fcb409e37e41d813\": rpc error: code = NotFound desc = could not find container \"5319d1cc3d02a76721dc0a8427195186872b40d7749d0d58fcb409e37e41d813\": container with ID starting with 5319d1cc3d02a76721dc0a8427195186872b40d7749d0d58fcb409e37e41d813 not found: 
ID does not exist" Nov 25 15:47:47 crc kubenswrapper[4704]: I1125 15:47:47.796995 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gc5rd"] Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.423339 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49cc56f5-b9bb-4694-8965-0e7c6a4aaae6" path="/var/lib/kubelet/pods/49cc56f5-b9bb-4694-8965-0e7c6a4aaae6/volumes" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.424066 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce56dcb-a916-41ca-b706-df5e157576eb" path="/var/lib/kubelet/pods/4ce56dcb-a916-41ca-b706-df5e157576eb/volumes" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.466894 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-675fd74d47-mjcln"] Nov 25 15:47:48 crc kubenswrapper[4704]: E1125 15:47:48.467272 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49cc56f5-b9bb-4694-8965-0e7c6a4aaae6" containerName="route-controller-manager" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.467292 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="49cc56f5-b9bb-4694-8965-0e7c6a4aaae6" containerName="route-controller-manager" Nov 25 15:47:48 crc kubenswrapper[4704]: E1125 15:47:48.467331 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce56dcb-a916-41ca-b706-df5e157576eb" containerName="controller-manager" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.467341 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce56dcb-a916-41ca-b706-df5e157576eb" containerName="controller-manager" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.467461 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce56dcb-a916-41ca-b706-df5e157576eb" containerName="controller-manager" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.467494 4704 
memory_manager.go:354] "RemoveStaleState removing state" podUID="49cc56f5-b9bb-4694-8965-0e7c6a4aaae6" containerName="route-controller-manager" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.468015 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.471674 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.472134 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.472323 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b"] Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.472670 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.473256 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.473694 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.475281 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.476699 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.476899 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.476994 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.477288 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.477292 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.477723 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.485109 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.491919 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.498955 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-675fd74d47-mjcln"] Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.515607 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b"] Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.557947 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-proxy-ca-bundles\") pod \"controller-manager-675fd74d47-mjcln\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.558065 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79f4af01-528d-4b01-9f40-b3942bc9c9d8-serving-cert\") pod \"route-controller-manager-85cff65667-2bp8b\" (UID: \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\") " pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.558105 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-client-ca\") pod \"controller-manager-675fd74d47-mjcln\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.558171 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79f4af01-528d-4b01-9f40-b3942bc9c9d8-client-ca\") pod \"route-controller-manager-85cff65667-2bp8b\" (UID: \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\") " 
pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.558266 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hplrb\" (UniqueName: \"kubernetes.io/projected/79f4af01-528d-4b01-9f40-b3942bc9c9d8-kube-api-access-hplrb\") pod \"route-controller-manager-85cff65667-2bp8b\" (UID: \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\") " pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.558336 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f4af01-528d-4b01-9f40-b3942bc9c9d8-config\") pod \"route-controller-manager-85cff65667-2bp8b\" (UID: \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\") " pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.558387 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f45ct\" (UniqueName: \"kubernetes.io/projected/00970c7b-2728-489b-a73d-cae6bd1ee00b-kube-api-access-f45ct\") pod \"controller-manager-675fd74d47-mjcln\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.558415 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-config\") pod \"controller-manager-675fd74d47-mjcln\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.558474 4704 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00970c7b-2728-489b-a73d-cae6bd1ee00b-serving-cert\") pod \"controller-manager-675fd74d47-mjcln\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.560350 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-675fd74d47-mjcln"] Nov 25 15:47:48 crc kubenswrapper[4704]: E1125 15:47:48.560798 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-f45ct proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" podUID="00970c7b-2728-489b-a73d-cae6bd1ee00b" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.584649 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b"] Nov 25 15:47:48 crc kubenswrapper[4704]: E1125 15:47:48.585310 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-hplrb serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" podUID="79f4af01-528d-4b01-9f40-b3942bc9c9d8" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.658874 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-client-ca\") pod \"controller-manager-675fd74d47-mjcln\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 
15:47:48.659200 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79f4af01-528d-4b01-9f40-b3942bc9c9d8-client-ca\") pod \"route-controller-manager-85cff65667-2bp8b\" (UID: \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\") " pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.659288 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hplrb\" (UniqueName: \"kubernetes.io/projected/79f4af01-528d-4b01-9f40-b3942bc9c9d8-kube-api-access-hplrb\") pod \"route-controller-manager-85cff65667-2bp8b\" (UID: \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\") " pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.659428 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f4af01-528d-4b01-9f40-b3942bc9c9d8-config\") pod \"route-controller-manager-85cff65667-2bp8b\" (UID: \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\") " pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.659590 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f45ct\" (UniqueName: \"kubernetes.io/projected/00970c7b-2728-489b-a73d-cae6bd1ee00b-kube-api-access-f45ct\") pod \"controller-manager-675fd74d47-mjcln\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.660078 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-config\") pod \"controller-manager-675fd74d47-mjcln\" (UID: 
\"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.660327 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79f4af01-528d-4b01-9f40-b3942bc9c9d8-client-ca\") pod \"route-controller-manager-85cff65667-2bp8b\" (UID: \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\") " pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.660513 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-client-ca\") pod \"controller-manager-675fd74d47-mjcln\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.660953 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f4af01-528d-4b01-9f40-b3942bc9c9d8-config\") pod \"route-controller-manager-85cff65667-2bp8b\" (UID: \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\") " pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.661394 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-config\") pod \"controller-manager-675fd74d47-mjcln\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.661862 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/00970c7b-2728-489b-a73d-cae6bd1ee00b-serving-cert\") pod \"controller-manager-675fd74d47-mjcln\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.662029 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-proxy-ca-bundles\") pod \"controller-manager-675fd74d47-mjcln\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.663597 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-proxy-ca-bundles\") pod \"controller-manager-675fd74d47-mjcln\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.663867 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79f4af01-528d-4b01-9f40-b3942bc9c9d8-serving-cert\") pod \"route-controller-manager-85cff65667-2bp8b\" (UID: \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\") " pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.667198 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00970c7b-2728-489b-a73d-cae6bd1ee00b-serving-cert\") pod \"controller-manager-675fd74d47-mjcln\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.667628 4704 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79f4af01-528d-4b01-9f40-b3942bc9c9d8-serving-cert\") pod \"route-controller-manager-85cff65667-2bp8b\" (UID: \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\") " pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.675744 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hplrb\" (UniqueName: \"kubernetes.io/projected/79f4af01-528d-4b01-9f40-b3942bc9c9d8-kube-api-access-hplrb\") pod \"route-controller-manager-85cff65667-2bp8b\" (UID: \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\") " pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.682208 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f45ct\" (UniqueName: \"kubernetes.io/projected/00970c7b-2728-489b-a73d-cae6bd1ee00b-kube-api-access-f45ct\") pod \"controller-manager-675fd74d47-mjcln\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.743632 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.743962 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.753373 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.758446 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.765634 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hplrb\" (UniqueName: \"kubernetes.io/projected/79f4af01-528d-4b01-9f40-b3942bc9c9d8-kube-api-access-hplrb\") pod \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\" (UID: \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\") " Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.765711 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-proxy-ca-bundles\") pod \"00970c7b-2728-489b-a73d-cae6bd1ee00b\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.765752 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-client-ca\") pod \"00970c7b-2728-489b-a73d-cae6bd1ee00b\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.765803 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-config\") pod \"00970c7b-2728-489b-a73d-cae6bd1ee00b\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.765841 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00970c7b-2728-489b-a73d-cae6bd1ee00b-serving-cert\") pod \"00970c7b-2728-489b-a73d-cae6bd1ee00b\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.765878 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-f45ct\" (UniqueName: \"kubernetes.io/projected/00970c7b-2728-489b-a73d-cae6bd1ee00b-kube-api-access-f45ct\") pod \"00970c7b-2728-489b-a73d-cae6bd1ee00b\" (UID: \"00970c7b-2728-489b-a73d-cae6bd1ee00b\") " Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.765955 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79f4af01-528d-4b01-9f40-b3942bc9c9d8-serving-cert\") pod \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\" (UID: \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\") " Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.766025 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79f4af01-528d-4b01-9f40-b3942bc9c9d8-client-ca\") pod \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\" (UID: \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\") " Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.766053 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f4af01-528d-4b01-9f40-b3942bc9c9d8-config\") pod \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\" (UID: \"79f4af01-528d-4b01-9f40-b3942bc9c9d8\") " Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.766263 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-client-ca" (OuterVolumeSpecName: "client-ca") pod "00970c7b-2728-489b-a73d-cae6bd1ee00b" (UID: "00970c7b-2728-489b-a73d-cae6bd1ee00b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.766447 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "00970c7b-2728-489b-a73d-cae6bd1ee00b" (UID: "00970c7b-2728-489b-a73d-cae6bd1ee00b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.766526 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-config" (OuterVolumeSpecName: "config") pod "00970c7b-2728-489b-a73d-cae6bd1ee00b" (UID: "00970c7b-2728-489b-a73d-cae6bd1ee00b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.766777 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79f4af01-528d-4b01-9f40-b3942bc9c9d8-config" (OuterVolumeSpecName: "config") pod "79f4af01-528d-4b01-9f40-b3942bc9c9d8" (UID: "79f4af01-528d-4b01-9f40-b3942bc9c9d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.766842 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79f4af01-528d-4b01-9f40-b3942bc9c9d8-client-ca" (OuterVolumeSpecName: "client-ca") pod "79f4af01-528d-4b01-9f40-b3942bc9c9d8" (UID: "79f4af01-528d-4b01-9f40-b3942bc9c9d8"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.769173 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79f4af01-528d-4b01-9f40-b3942bc9c9d8-kube-api-access-hplrb" (OuterVolumeSpecName: "kube-api-access-hplrb") pod "79f4af01-528d-4b01-9f40-b3942bc9c9d8" (UID: "79f4af01-528d-4b01-9f40-b3942bc9c9d8"). InnerVolumeSpecName "kube-api-access-hplrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.770046 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00970c7b-2728-489b-a73d-cae6bd1ee00b-kube-api-access-f45ct" (OuterVolumeSpecName: "kube-api-access-f45ct") pod "00970c7b-2728-489b-a73d-cae6bd1ee00b" (UID: "00970c7b-2728-489b-a73d-cae6bd1ee00b"). InnerVolumeSpecName "kube-api-access-f45ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.770378 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00970c7b-2728-489b-a73d-cae6bd1ee00b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "00970c7b-2728-489b-a73d-cae6bd1ee00b" (UID: "00970c7b-2728-489b-a73d-cae6bd1ee00b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.773991 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f4af01-528d-4b01-9f40-b3942bc9c9d8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "79f4af01-528d-4b01-9f40-b3942bc9c9d8" (UID: "79f4af01-528d-4b01-9f40-b3942bc9c9d8"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.867338 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00970c7b-2728-489b-a73d-cae6bd1ee00b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.867660 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f45ct\" (UniqueName: \"kubernetes.io/projected/00970c7b-2728-489b-a73d-cae6bd1ee00b-kube-api-access-f45ct\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.867721 4704 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79f4af01-528d-4b01-9f40-b3942bc9c9d8-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.867774 4704 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79f4af01-528d-4b01-9f40-b3942bc9c9d8-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.867878 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f4af01-528d-4b01-9f40-b3942bc9c9d8-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.868001 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hplrb\" (UniqueName: \"kubernetes.io/projected/79f4af01-528d-4b01-9f40-b3942bc9c9d8-kube-api-access-hplrb\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.868068 4704 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.868136 4704 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:48 crc kubenswrapper[4704]: I1125 15:47:48.868196 4704 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00970c7b-2728-489b-a73d-cae6bd1ee00b-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.748613 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-675fd74d47-mjcln" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.748668 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.790717 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-649b7dd58-b8wfk"] Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.791413 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.793853 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.793934 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.793972 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.799361 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.802781 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-675fd74d47-mjcln"] Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.800279 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.802111 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.823672 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-675fd74d47-mjcln"] Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.827410 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.831924 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-649b7dd58-b8wfk"] Nov 25 15:47:49 crc 
kubenswrapper[4704]: I1125 15:47:49.840629 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b"] Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.844197 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cff65667-2bp8b"] Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.882637 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4-proxy-ca-bundles\") pod \"controller-manager-649b7dd58-b8wfk\" (UID: \"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4\") " pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.883051 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4-client-ca\") pod \"controller-manager-649b7dd58-b8wfk\" (UID: \"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4\") " pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.883197 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jrc5\" (UniqueName: \"kubernetes.io/projected/9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4-kube-api-access-6jrc5\") pod \"controller-manager-649b7dd58-b8wfk\" (UID: \"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4\") " pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.883324 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4-config\") pod 
\"controller-manager-649b7dd58-b8wfk\" (UID: \"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4\") " pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.883471 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4-serving-cert\") pod \"controller-manager-649b7dd58-b8wfk\" (UID: \"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4\") " pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.984849 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jrc5\" (UniqueName: \"kubernetes.io/projected/9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4-kube-api-access-6jrc5\") pod \"controller-manager-649b7dd58-b8wfk\" (UID: \"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4\") " pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.985195 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4-config\") pod \"controller-manager-649b7dd58-b8wfk\" (UID: \"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4\") " pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.985362 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4-serving-cert\") pod \"controller-manager-649b7dd58-b8wfk\" (UID: \"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4\") " pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.985467 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4-proxy-ca-bundles\") pod \"controller-manager-649b7dd58-b8wfk\" (UID: \"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4\") " pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.985551 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4-client-ca\") pod \"controller-manager-649b7dd58-b8wfk\" (UID: \"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4\") " pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.986531 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4-client-ca\") pod \"controller-manager-649b7dd58-b8wfk\" (UID: \"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4\") " pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.987144 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4-config\") pod \"controller-manager-649b7dd58-b8wfk\" (UID: \"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4\") " pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.987637 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4-proxy-ca-bundles\") pod \"controller-manager-649b7dd58-b8wfk\" (UID: \"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4\") " pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:49 crc kubenswrapper[4704]: I1125 15:47:49.990951 4704 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4-serving-cert\") pod \"controller-manager-649b7dd58-b8wfk\" (UID: \"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4\") " pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:50 crc kubenswrapper[4704]: I1125 15:47:50.007824 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jrc5\" (UniqueName: \"kubernetes.io/projected/9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4-kube-api-access-6jrc5\") pod \"controller-manager-649b7dd58-b8wfk\" (UID: \"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4\") " pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:50 crc kubenswrapper[4704]: I1125 15:47:50.125286 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:50 crc kubenswrapper[4704]: I1125 15:47:50.356191 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-649b7dd58-b8wfk"] Nov 25 15:47:50 crc kubenswrapper[4704]: I1125 15:47:50.424480 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00970c7b-2728-489b-a73d-cae6bd1ee00b" path="/var/lib/kubelet/pods/00970c7b-2728-489b-a73d-cae6bd1ee00b/volumes" Nov 25 15:47:50 crc kubenswrapper[4704]: I1125 15:47:50.425662 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79f4af01-528d-4b01-9f40-b3942bc9c9d8" path="/var/lib/kubelet/pods/79f4af01-528d-4b01-9f40-b3942bc9c9d8/volumes" Nov 25 15:47:50 crc kubenswrapper[4704]: I1125 15:47:50.756964 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" event={"ID":"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4","Type":"ContainerStarted","Data":"cadda47e444b3e14fd0c75927eb5ee167ad519f89fc993d366b669c3fb8ec148"} Nov 25 15:47:50 crc 
kubenswrapper[4704]: I1125 15:47:50.757028 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" event={"ID":"9bcc2a51-f671-45f8-b5b4-ae2f2fe2c9f4","Type":"ContainerStarted","Data":"d5facc73f506982e5faa1d8a06d43d6a2d3ac61769430c03fc7eed7fd451c18b"} Nov 25 15:47:50 crc kubenswrapper[4704]: I1125 15:47:50.757306 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:50 crc kubenswrapper[4704]: I1125 15:47:50.766208 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" Nov 25 15:47:50 crc kubenswrapper[4704]: I1125 15:47:50.795441 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-649b7dd58-b8wfk" podStartSLOduration=2.795415646 podStartE2EDuration="2.795415646s" podCreationTimestamp="2025-11-25 15:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:47:50.77992533 +0000 UTC m=+757.048199121" watchObservedRunningTime="2025-11-25 15:47:50.795415646 +0000 UTC m=+757.063689427" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.467891 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t"] Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.469126 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.471470 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.471851 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.471926 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.472480 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.472850 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.473667 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.479604 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t"] Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.541990 4704 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.619161 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e717cc12-08e3-40b5-ab77-b2b2dc2ff821-config\") pod \"route-controller-manager-5f6b89dfb-jzc6t\" (UID: \"e717cc12-08e3-40b5-ab77-b2b2dc2ff821\") 
" pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.619222 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e717cc12-08e3-40b5-ab77-b2b2dc2ff821-client-ca\") pod \"route-controller-manager-5f6b89dfb-jzc6t\" (UID: \"e717cc12-08e3-40b5-ab77-b2b2dc2ff821\") " pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.619768 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmvw5\" (UniqueName: \"kubernetes.io/projected/e717cc12-08e3-40b5-ab77-b2b2dc2ff821-kube-api-access-dmvw5\") pod \"route-controller-manager-5f6b89dfb-jzc6t\" (UID: \"e717cc12-08e3-40b5-ab77-b2b2dc2ff821\") " pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.619871 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e717cc12-08e3-40b5-ab77-b2b2dc2ff821-serving-cert\") pod \"route-controller-manager-5f6b89dfb-jzc6t\" (UID: \"e717cc12-08e3-40b5-ab77-b2b2dc2ff821\") " pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.725181 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e717cc12-08e3-40b5-ab77-b2b2dc2ff821-config\") pod \"route-controller-manager-5f6b89dfb-jzc6t\" (UID: \"e717cc12-08e3-40b5-ab77-b2b2dc2ff821\") " pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.726636 4704 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e717cc12-08e3-40b5-ab77-b2b2dc2ff821-config\") pod \"route-controller-manager-5f6b89dfb-jzc6t\" (UID: \"e717cc12-08e3-40b5-ab77-b2b2dc2ff821\") " pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.734731 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e717cc12-08e3-40b5-ab77-b2b2dc2ff821-client-ca\") pod \"route-controller-manager-5f6b89dfb-jzc6t\" (UID: \"e717cc12-08e3-40b5-ab77-b2b2dc2ff821\") " pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.735523 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmvw5\" (UniqueName: \"kubernetes.io/projected/e717cc12-08e3-40b5-ab77-b2b2dc2ff821-kube-api-access-dmvw5\") pod \"route-controller-manager-5f6b89dfb-jzc6t\" (UID: \"e717cc12-08e3-40b5-ab77-b2b2dc2ff821\") " pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.735636 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e717cc12-08e3-40b5-ab77-b2b2dc2ff821-serving-cert\") pod \"route-controller-manager-5f6b89dfb-jzc6t\" (UID: \"e717cc12-08e3-40b5-ab77-b2b2dc2ff821\") " pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.735889 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e717cc12-08e3-40b5-ab77-b2b2dc2ff821-client-ca\") pod \"route-controller-manager-5f6b89dfb-jzc6t\" (UID: \"e717cc12-08e3-40b5-ab77-b2b2dc2ff821\") " 
pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.754930 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e717cc12-08e3-40b5-ab77-b2b2dc2ff821-serving-cert\") pod \"route-controller-manager-5f6b89dfb-jzc6t\" (UID: \"e717cc12-08e3-40b5-ab77-b2b2dc2ff821\") " pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.759551 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmvw5\" (UniqueName: \"kubernetes.io/projected/e717cc12-08e3-40b5-ab77-b2b2dc2ff821-kube-api-access-dmvw5\") pod \"route-controller-manager-5f6b89dfb-jzc6t\" (UID: \"e717cc12-08e3-40b5-ab77-b2b2dc2ff821\") " pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" Nov 25 15:47:52 crc kubenswrapper[4704]: I1125 15:47:52.790417 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" Nov 25 15:47:53 crc kubenswrapper[4704]: I1125 15:47:53.257395 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t"] Nov 25 15:47:53 crc kubenswrapper[4704]: I1125 15:47:53.776601 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" event={"ID":"e717cc12-08e3-40b5-ab77-b2b2dc2ff821","Type":"ContainerStarted","Data":"cdce7c75283ef8301bd8b2abe190a8d2f40f4e32ca8ffde6b18f91daa0d29380"} Nov 25 15:47:53 crc kubenswrapper[4704]: I1125 15:47:53.776674 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" event={"ID":"e717cc12-08e3-40b5-ab77-b2b2dc2ff821","Type":"ContainerStarted","Data":"5fb2d7667ae32bbee1676e5f19ea166c58d7d632c16b4038d6c6ed9b28fcd343"} Nov 25 15:47:53 crc kubenswrapper[4704]: I1125 15:47:53.777331 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" Nov 25 15:47:53 crc kubenswrapper[4704]: I1125 15:47:53.785150 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" Nov 25 15:47:53 crc kubenswrapper[4704]: I1125 15:47:53.799643 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5f6b89dfb-jzc6t" podStartSLOduration=5.7996213359999995 podStartE2EDuration="5.799621336s" podCreationTimestamp="2025-11-25 15:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:47:53.798269578 +0000 UTC m=+760.066543359" 
watchObservedRunningTime="2025-11-25 15:47:53.799621336 +0000 UTC m=+760.067895117" Nov 25 15:47:58 crc kubenswrapper[4704]: I1125 15:47:58.099555 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-65c6dc9bcf-5m5qc" Nov 25 15:48:17 crc kubenswrapper[4704]: I1125 15:48:17.819483 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5f4957f9b7-lcxrs" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.560122 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8"] Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.561141 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.566087 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-dllmv"] Nov 25 15:48:18 crc kubenswrapper[4704]: W1125 15:48:18.566103 4704 reflector.go:561] object-"metallb-system"/"frr-k8s-webhook-server-cert": failed to list *v1.Secret: secrets "frr-k8s-webhook-server-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Nov 25 15:48:18 crc kubenswrapper[4704]: E1125 15:48:18.566259 4704 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"frr-k8s-webhook-server-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"frr-k8s-webhook-server-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 25 15:48:18 crc kubenswrapper[4704]: W1125 15:48:18.566179 4704 reflector.go:561] 
object-"metallb-system"/"frr-k8s-daemon-dockercfg-889k8": failed to list *v1.Secret: secrets "frr-k8s-daemon-dockercfg-889k8" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Nov 25 15:48:18 crc kubenswrapper[4704]: E1125 15:48:18.566299 4704 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"frr-k8s-daemon-dockercfg-889k8\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"frr-k8s-daemon-dockercfg-889k8\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.568764 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.574670 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.574773 4704 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.616501 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8"] Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.662362 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-frr-startup\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.662405 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-frr-conf\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.662455 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-metrics\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.662588 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebf01769-76d3-4f64-bd68-f20add5a1266-cert\") pod \"frr-k8s-webhook-server-6998585d5-l8hl8\" (UID: \"ebf01769-76d3-4f64-bd68-f20add5a1266\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.662637 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgntz\" (UniqueName: \"kubernetes.io/projected/ebf01769-76d3-4f64-bd68-f20add5a1266-kube-api-access-wgntz\") pod \"frr-k8s-webhook-server-6998585d5-l8hl8\" (UID: \"ebf01769-76d3-4f64-bd68-f20add5a1266\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.662742 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-reloader\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.662844 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-n7r6q\" (UniqueName: \"kubernetes.io/projected/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-kube-api-access-n7r6q\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.662871 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-metrics-certs\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.662901 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-frr-sockets\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.664666 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-2bp8m"] Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.665608 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-2bp8m" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.668459 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-lbpq9"] Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.669427 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-lbpq9" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.672599 4704 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.673103 4704 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.673777 4704 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.674411 4704 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-4b5nw" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.675167 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.735528 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-2bp8m"] Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.763503 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7r6q\" (UniqueName: \"kubernetes.io/projected/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-kube-api-access-n7r6q\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.764327 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e21f57a-b980-4bb4-8367-b2c3216b5e17-metrics-certs\") pod \"controller-6c7b4b5f48-2bp8m\" (UID: \"3e21f57a-b980-4bb4-8367-b2c3216b5e17\") " pod="metallb-system/controller-6c7b4b5f48-2bp8m" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.764599 4704 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-metrics-certs\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.764733 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88n9d\" (UniqueName: \"kubernetes.io/projected/3e21f57a-b980-4bb4-8367-b2c3216b5e17-kube-api-access-88n9d\") pod \"controller-6c7b4b5f48-2bp8m\" (UID: \"3e21f57a-b980-4bb4-8367-b2c3216b5e17\") " pod="metallb-system/controller-6c7b4b5f48-2bp8m" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.764877 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-frr-sockets\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.764997 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0f413579-b97a-443b-8487-b1424b1e5a4e-memberlist\") pod \"speaker-lbpq9\" (UID: \"0f413579-b97a-443b-8487-b1424b1e5a4e\") " pod="metallb-system/speaker-lbpq9" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.765142 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs4ng\" (UniqueName: \"kubernetes.io/projected/0f413579-b97a-443b-8487-b1424b1e5a4e-kube-api-access-xs4ng\") pod \"speaker-lbpq9\" (UID: \"0f413579-b97a-443b-8487-b1424b1e5a4e\") " pod="metallb-system/speaker-lbpq9" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.765291 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/3e21f57a-b980-4bb4-8367-b2c3216b5e17-cert\") pod \"controller-6c7b4b5f48-2bp8m\" (UID: \"3e21f57a-b980-4bb4-8367-b2c3216b5e17\") " pod="metallb-system/controller-6c7b4b5f48-2bp8m" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.765412 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-frr-startup\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.765484 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-frr-sockets\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.765508 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-frr-conf\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: E1125 15:48:18.764889 4704 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Nov 25 15:48:18 crc kubenswrapper[4704]: E1125 15:48:18.765731 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-metrics-certs podName:29b053f9-2ac9-4eeb-bb2b-adbe17dfab59 nodeName:}" failed. No retries permitted until 2025-11-25 15:48:19.265706411 +0000 UTC m=+785.533980192 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-metrics-certs") pod "frr-k8s-dllmv" (UID: "29b053f9-2ac9-4eeb-bb2b-adbe17dfab59") : secret "frr-k8s-certs-secret" not found Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.765627 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0f413579-b97a-443b-8487-b1424b1e5a4e-metallb-excludel2\") pod \"speaker-lbpq9\" (UID: \"0f413579-b97a-443b-8487-b1424b1e5a4e\") " pod="metallb-system/speaker-lbpq9" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.765902 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-metrics\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.766004 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebf01769-76d3-4f64-bd68-f20add5a1266-cert\") pod \"frr-k8s-webhook-server-6998585d5-l8hl8\" (UID: \"ebf01769-76d3-4f64-bd68-f20add5a1266\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.766090 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgntz\" (UniqueName: \"kubernetes.io/projected/ebf01769-76d3-4f64-bd68-f20add5a1266-kube-api-access-wgntz\") pod \"frr-k8s-webhook-server-6998585d5-l8hl8\" (UID: \"ebf01769-76d3-4f64-bd68-f20add5a1266\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.766180 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-reloader\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.766240 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-metrics\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.766256 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f413579-b97a-443b-8487-b1424b1e5a4e-metrics-certs\") pod \"speaker-lbpq9\" (UID: \"0f413579-b97a-443b-8487-b1424b1e5a4e\") " pod="metallb-system/speaker-lbpq9" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.766508 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-reloader\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.766629 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-frr-startup\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.767015 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-frr-conf\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 
15:48:18.796984 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7r6q\" (UniqueName: \"kubernetes.io/projected/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-kube-api-access-n7r6q\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.797032 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgntz\" (UniqueName: \"kubernetes.io/projected/ebf01769-76d3-4f64-bd68-f20add5a1266-kube-api-access-wgntz\") pod \"frr-k8s-webhook-server-6998585d5-l8hl8\" (UID: \"ebf01769-76d3-4f64-bd68-f20add5a1266\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.867938 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0f413579-b97a-443b-8487-b1424b1e5a4e-metallb-excludel2\") pod \"speaker-lbpq9\" (UID: \"0f413579-b97a-443b-8487-b1424b1e5a4e\") " pod="metallb-system/speaker-lbpq9" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.868969 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0f413579-b97a-443b-8487-b1424b1e5a4e-metallb-excludel2\") pod \"speaker-lbpq9\" (UID: \"0f413579-b97a-443b-8487-b1424b1e5a4e\") " pod="metallb-system/speaker-lbpq9" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.869115 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f413579-b97a-443b-8487-b1424b1e5a4e-metrics-certs\") pod \"speaker-lbpq9\" (UID: \"0f413579-b97a-443b-8487-b1424b1e5a4e\") " pod="metallb-system/speaker-lbpq9" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.869200 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/3e21f57a-b980-4bb4-8367-b2c3216b5e17-metrics-certs\") pod \"controller-6c7b4b5f48-2bp8m\" (UID: \"3e21f57a-b980-4bb4-8367-b2c3216b5e17\") " pod="metallb-system/controller-6c7b4b5f48-2bp8m" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.869241 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88n9d\" (UniqueName: \"kubernetes.io/projected/3e21f57a-b980-4bb4-8367-b2c3216b5e17-kube-api-access-88n9d\") pod \"controller-6c7b4b5f48-2bp8m\" (UID: \"3e21f57a-b980-4bb4-8367-b2c3216b5e17\") " pod="metallb-system/controller-6c7b4b5f48-2bp8m" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.869274 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0f413579-b97a-443b-8487-b1424b1e5a4e-memberlist\") pod \"speaker-lbpq9\" (UID: \"0f413579-b97a-443b-8487-b1424b1e5a4e\") " pod="metallb-system/speaker-lbpq9" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.869298 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs4ng\" (UniqueName: \"kubernetes.io/projected/0f413579-b97a-443b-8487-b1424b1e5a4e-kube-api-access-xs4ng\") pod \"speaker-lbpq9\" (UID: \"0f413579-b97a-443b-8487-b1424b1e5a4e\") " pod="metallb-system/speaker-lbpq9" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.869315 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e21f57a-b980-4bb4-8367-b2c3216b5e17-cert\") pod \"controller-6c7b4b5f48-2bp8m\" (UID: \"3e21f57a-b980-4bb4-8367-b2c3216b5e17\") " pod="metallb-system/controller-6c7b4b5f48-2bp8m" Nov 25 15:48:18 crc kubenswrapper[4704]: E1125 15:48:18.870036 4704 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 25 15:48:18 crc kubenswrapper[4704]: E1125 15:48:18.870128 4704 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f413579-b97a-443b-8487-b1424b1e5a4e-memberlist podName:0f413579-b97a-443b-8487-b1424b1e5a4e nodeName:}" failed. No retries permitted until 2025-11-25 15:48:19.370100852 +0000 UTC m=+785.638374633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0f413579-b97a-443b-8487-b1424b1e5a4e-memberlist") pod "speaker-lbpq9" (UID: "0f413579-b97a-443b-8487-b1424b1e5a4e") : secret "metallb-memberlist" not found Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.873545 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f413579-b97a-443b-8487-b1424b1e5a4e-metrics-certs\") pod \"speaker-lbpq9\" (UID: \"0f413579-b97a-443b-8487-b1424b1e5a4e\") " pod="metallb-system/speaker-lbpq9" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.874089 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e21f57a-b980-4bb4-8367-b2c3216b5e17-metrics-certs\") pod \"controller-6c7b4b5f48-2bp8m\" (UID: \"3e21f57a-b980-4bb4-8367-b2c3216b5e17\") " pod="metallb-system/controller-6c7b4b5f48-2bp8m" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.875025 4704 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.884208 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e21f57a-b980-4bb4-8367-b2c3216b5e17-cert\") pod \"controller-6c7b4b5f48-2bp8m\" (UID: \"3e21f57a-b980-4bb4-8367-b2c3216b5e17\") " pod="metallb-system/controller-6c7b4b5f48-2bp8m" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.900185 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs4ng\" (UniqueName: 
\"kubernetes.io/projected/0f413579-b97a-443b-8487-b1424b1e5a4e-kube-api-access-xs4ng\") pod \"speaker-lbpq9\" (UID: \"0f413579-b97a-443b-8487-b1424b1e5a4e\") " pod="metallb-system/speaker-lbpq9" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.900510 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88n9d\" (UniqueName: \"kubernetes.io/projected/3e21f57a-b980-4bb4-8367-b2c3216b5e17-kube-api-access-88n9d\") pod \"controller-6c7b4b5f48-2bp8m\" (UID: \"3e21f57a-b980-4bb4-8367-b2c3216b5e17\") " pod="metallb-system/controller-6c7b4b5f48-2bp8m" Nov 25 15:48:18 crc kubenswrapper[4704]: I1125 15:48:18.980183 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-2bp8m" Nov 25 15:48:19 crc kubenswrapper[4704]: E1125 15:48:19.855673 4704 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: failed to sync secret cache: timed out waiting for the condition Nov 25 15:48:19 crc kubenswrapper[4704]: E1125 15:48:19.856188 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebf01769-76d3-4f64-bd68-f20add5a1266-cert podName:ebf01769-76d3-4f64-bd68-f20add5a1266 nodeName:}" failed. No retries permitted until 2025-11-25 15:48:20.356163191 +0000 UTC m=+786.624437122 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ebf01769-76d3-4f64-bd68-f20add5a1266-cert") pod "frr-k8s-webhook-server-6998585d5-l8hl8" (UID: "ebf01769-76d3-4f64-bd68-f20add5a1266") : failed to sync secret cache: timed out waiting for the condition Nov 25 15:48:19 crc kubenswrapper[4704]: I1125 15:48:19.855809 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-metrics-certs\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:19 crc kubenswrapper[4704]: I1125 15:48:19.857328 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0f413579-b97a-443b-8487-b1424b1e5a4e-memberlist\") pod \"speaker-lbpq9\" (UID: \"0f413579-b97a-443b-8487-b1424b1e5a4e\") " pod="metallb-system/speaker-lbpq9" Nov 25 15:48:19 crc kubenswrapper[4704]: E1125 15:48:19.857967 4704 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 25 15:48:19 crc kubenswrapper[4704]: E1125 15:48:19.858011 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f413579-b97a-443b-8487-b1424b1e5a4e-memberlist podName:0f413579-b97a-443b-8487-b1424b1e5a4e nodeName:}" failed. No retries permitted until 2025-11-25 15:48:20.857997374 +0000 UTC m=+787.126271155 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0f413579-b97a-443b-8487-b1424b1e5a4e-memberlist") pod "speaker-lbpq9" (UID: "0f413579-b97a-443b-8487-b1424b1e5a4e") : secret "metallb-memberlist" not found Nov 25 15:48:19 crc kubenswrapper[4704]: I1125 15:48:19.862481 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29b053f9-2ac9-4eeb-bb2b-adbe17dfab59-metrics-certs\") pod \"frr-k8s-dllmv\" (UID: \"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59\") " pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:19 crc kubenswrapper[4704]: I1125 15:48:19.884272 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-2bp8m"] Nov 25 15:48:19 crc kubenswrapper[4704]: I1125 15:48:19.906637 4704 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 25 15:48:19 crc kubenswrapper[4704]: I1125 15:48:19.930415 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-2bp8m" event={"ID":"3e21f57a-b980-4bb4-8367-b2c3216b5e17","Type":"ContainerStarted","Data":"1cd1e33dfdd6fa85b1fa4908dbfaac5c7cc58300f20b7b2a658a3c61e78e7068"} Nov 25 15:48:20 crc kubenswrapper[4704]: I1125 15:48:20.149295 4704 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-889k8" Nov 25 15:48:20 crc kubenswrapper[4704]: I1125 15:48:20.149563 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:20 crc kubenswrapper[4704]: I1125 15:48:20.366278 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebf01769-76d3-4f64-bd68-f20add5a1266-cert\") pod \"frr-k8s-webhook-server-6998585d5-l8hl8\" (UID: \"ebf01769-76d3-4f64-bd68-f20add5a1266\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8" Nov 25 15:48:20 crc kubenswrapper[4704]: I1125 15:48:20.373458 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebf01769-76d3-4f64-bd68-f20add5a1266-cert\") pod \"frr-k8s-webhook-server-6998585d5-l8hl8\" (UID: \"ebf01769-76d3-4f64-bd68-f20add5a1266\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8" Nov 25 15:48:20 crc kubenswrapper[4704]: I1125 15:48:20.381906 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8" Nov 25 15:48:20 crc kubenswrapper[4704]: I1125 15:48:20.774059 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8"] Nov 25 15:48:20 crc kubenswrapper[4704]: W1125 15:48:20.778633 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebf01769_76d3_4f64_bd68_f20add5a1266.slice/crio-7087e6e2bc38625138f3ad4aa8b224e2d3cc87eba34f2c1c98ec86d84cbe71fa WatchSource:0}: Error finding container 7087e6e2bc38625138f3ad4aa8b224e2d3cc87eba34f2c1c98ec86d84cbe71fa: Status 404 returned error can't find the container with id 7087e6e2bc38625138f3ad4aa8b224e2d3cc87eba34f2c1c98ec86d84cbe71fa Nov 25 15:48:20 crc kubenswrapper[4704]: I1125 15:48:20.871685 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0f413579-b97a-443b-8487-b1424b1e5a4e-memberlist\") 
pod \"speaker-lbpq9\" (UID: \"0f413579-b97a-443b-8487-b1424b1e5a4e\") " pod="metallb-system/speaker-lbpq9" Nov 25 15:48:20 crc kubenswrapper[4704]: I1125 15:48:20.879078 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0f413579-b97a-443b-8487-b1424b1e5a4e-memberlist\") pod \"speaker-lbpq9\" (UID: \"0f413579-b97a-443b-8487-b1424b1e5a4e\") " pod="metallb-system/speaker-lbpq9" Nov 25 15:48:20 crc kubenswrapper[4704]: I1125 15:48:20.937981 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-2bp8m" event={"ID":"3e21f57a-b980-4bb4-8367-b2c3216b5e17","Type":"ContainerStarted","Data":"bb31702cc6126231344dc2c4150b91613da5e52239c57a0ed233f8a51432a835"} Nov 25 15:48:20 crc kubenswrapper[4704]: I1125 15:48:20.939045 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dllmv" event={"ID":"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59","Type":"ContainerStarted","Data":"765a7b9dbb2ada4aef142ed1cb8c3a832dddb2baf6a6d646a3fe117ba24f28f6"} Nov 25 15:48:20 crc kubenswrapper[4704]: I1125 15:48:20.940013 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8" event={"ID":"ebf01769-76d3-4f64-bd68-f20add5a1266","Type":"ContainerStarted","Data":"7087e6e2bc38625138f3ad4aa8b224e2d3cc87eba34f2c1c98ec86d84cbe71fa"} Nov 25 15:48:21 crc kubenswrapper[4704]: I1125 15:48:21.088275 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-lbpq9" Nov 25 15:48:21 crc kubenswrapper[4704]: W1125 15:48:21.152828 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f413579_b97a_443b_8487_b1424b1e5a4e.slice/crio-4929e368a7d6fd13a9238296444b6c0d730397a1bead67e62ce3ba6b43e831b4 WatchSource:0}: Error finding container 4929e368a7d6fd13a9238296444b6c0d730397a1bead67e62ce3ba6b43e831b4: Status 404 returned error can't find the container with id 4929e368a7d6fd13a9238296444b6c0d730397a1bead67e62ce3ba6b43e831b4 Nov 25 15:48:21 crc kubenswrapper[4704]: I1125 15:48:21.954387 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lbpq9" event={"ID":"0f413579-b97a-443b-8487-b1424b1e5a4e","Type":"ContainerStarted","Data":"44c817baad59b986952963eb8077b79d5c752c5f8aeb8dc117ee9129aab51eb9"} Nov 25 15:48:21 crc kubenswrapper[4704]: I1125 15:48:21.954449 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lbpq9" event={"ID":"0f413579-b97a-443b-8487-b1424b1e5a4e","Type":"ContainerStarted","Data":"4929e368a7d6fd13a9238296444b6c0d730397a1bead67e62ce3ba6b43e831b4"} Nov 25 15:48:23 crc kubenswrapper[4704]: I1125 15:48:23.966030 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-2bp8m" event={"ID":"3e21f57a-b980-4bb4-8367-b2c3216b5e17","Type":"ContainerStarted","Data":"5cd396862e4089c9ca21f95402d25ee416aaf03fa0b8c49508f7bba33987ffcf"} Nov 25 15:48:23 crc kubenswrapper[4704]: I1125 15:48:23.967780 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-2bp8m" Nov 25 15:48:23 crc kubenswrapper[4704]: I1125 15:48:23.970315 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lbpq9" 
event={"ID":"0f413579-b97a-443b-8487-b1424b1e5a4e","Type":"ContainerStarted","Data":"d4a366ffc6062efa99c76640f0a729ad8444b5136e9ebcc0b6aa12ebe2b6aace"} Nov 25 15:48:23 crc kubenswrapper[4704]: I1125 15:48:23.970885 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-lbpq9" Nov 25 15:48:23 crc kubenswrapper[4704]: I1125 15:48:23.984463 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-2bp8m" podStartSLOduration=2.509437797 podStartE2EDuration="5.98444387s" podCreationTimestamp="2025-11-25 15:48:18 +0000 UTC" firstStartedPulling="2025-11-25 15:48:20.067002272 +0000 UTC m=+786.335276053" lastFinishedPulling="2025-11-25 15:48:23.542008345 +0000 UTC m=+789.810282126" observedRunningTime="2025-11-25 15:48:23.983306887 +0000 UTC m=+790.251580668" watchObservedRunningTime="2025-11-25 15:48:23.98444387 +0000 UTC m=+790.252717641" Nov 25 15:48:24 crc kubenswrapper[4704]: I1125 15:48:24.000820 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-lbpq9" podStartSLOduration=3.906116198 podStartE2EDuration="6.000763537s" podCreationTimestamp="2025-11-25 15:48:18 +0000 UTC" firstStartedPulling="2025-11-25 15:48:21.448337124 +0000 UTC m=+787.716610905" lastFinishedPulling="2025-11-25 15:48:23.542984463 +0000 UTC m=+789.811258244" observedRunningTime="2025-11-25 15:48:23.997853664 +0000 UTC m=+790.266127455" watchObservedRunningTime="2025-11-25 15:48:24.000763537 +0000 UTC m=+790.269037318" Nov 25 15:48:28 crc kubenswrapper[4704]: I1125 15:48:28.000599 4704 generic.go:334] "Generic (PLEG): container finished" podID="29b053f9-2ac9-4eeb-bb2b-adbe17dfab59" containerID="ba148b697736d0702641b3953506502405c5161b496be226e8c76a3a9dbd4c01" exitCode=0 Nov 25 15:48:28 crc kubenswrapper[4704]: I1125 15:48:28.000661 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dllmv" 
event={"ID":"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59","Type":"ContainerDied","Data":"ba148b697736d0702641b3953506502405c5161b496be226e8c76a3a9dbd4c01"} Nov 25 15:48:28 crc kubenswrapper[4704]: I1125 15:48:28.002956 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8" event={"ID":"ebf01769-76d3-4f64-bd68-f20add5a1266","Type":"ContainerStarted","Data":"5fe76ca43a890e506fc4aa6247c7c9ae099a7cdf521f2d2d5a051a6223208850"} Nov 25 15:48:28 crc kubenswrapper[4704]: I1125 15:48:28.003114 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8" Nov 25 15:48:28 crc kubenswrapper[4704]: I1125 15:48:28.056857 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8" podStartSLOduration=3.697900602 podStartE2EDuration="10.056831356s" podCreationTimestamp="2025-11-25 15:48:18 +0000 UTC" firstStartedPulling="2025-11-25 15:48:20.780994565 +0000 UTC m=+787.049268346" lastFinishedPulling="2025-11-25 15:48:27.139925319 +0000 UTC m=+793.408199100" observedRunningTime="2025-11-25 15:48:28.054313404 +0000 UTC m=+794.322587215" watchObservedRunningTime="2025-11-25 15:48:28.056831356 +0000 UTC m=+794.325105147" Nov 25 15:48:29 crc kubenswrapper[4704]: I1125 15:48:29.018907 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dllmv" event={"ID":"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59","Type":"ContainerDied","Data":"feda750e7459aa95e3b4112280b889f474f2131b88ea7ecbfb37e330c277c695"} Nov 25 15:48:29 crc kubenswrapper[4704]: I1125 15:48:29.018876 4704 generic.go:334] "Generic (PLEG): container finished" podID="29b053f9-2ac9-4eeb-bb2b-adbe17dfab59" containerID="feda750e7459aa95e3b4112280b889f474f2131b88ea7ecbfb37e330c277c695" exitCode=0 Nov 25 15:48:30 crc kubenswrapper[4704]: I1125 15:48:30.026151 4704 generic.go:334] "Generic (PLEG): container finished" 
podID="29b053f9-2ac9-4eeb-bb2b-adbe17dfab59" containerID="0f1557cda2069d13bdba5b624ba3d237a3064a5fb95806c0a82ee0263341c22d" exitCode=0 Nov 25 15:48:30 crc kubenswrapper[4704]: I1125 15:48:30.026191 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dllmv" event={"ID":"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59","Type":"ContainerDied","Data":"0f1557cda2069d13bdba5b624ba3d237a3064a5fb95806c0a82ee0263341c22d"} Nov 25 15:48:31 crc kubenswrapper[4704]: I1125 15:48:31.037120 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dllmv" event={"ID":"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59","Type":"ContainerStarted","Data":"ff147cb0a5756ab91813babc17cfde803411f5bff48d13bd8a87688b00bb4fbe"} Nov 25 15:48:31 crc kubenswrapper[4704]: I1125 15:48:31.037386 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dllmv" event={"ID":"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59","Type":"ContainerStarted","Data":"00d2baad4da887b5c8432c232cf0f1fe655645b0c68f8963aa564f3eaa902596"} Nov 25 15:48:31 crc kubenswrapper[4704]: I1125 15:48:31.037397 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dllmv" event={"ID":"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59","Type":"ContainerStarted","Data":"80d77163b7af3b47fd5953b2df42a57a47d41714d33d0686f44c96dbe3889a84"} Nov 25 15:48:31 crc kubenswrapper[4704]: I1125 15:48:31.037406 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dllmv" event={"ID":"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59","Type":"ContainerStarted","Data":"257fd9ddd76479f0e5bb77bf2a62a1ddd256e3d7d1e0bea40b955090b3a39c46"} Nov 25 15:48:31 crc kubenswrapper[4704]: I1125 15:48:31.037416 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dllmv" event={"ID":"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59","Type":"ContainerStarted","Data":"ed93388fed7bc4a72db07e9057d02b3ff0aafd779c483dab20890fef8d221575"} Nov 25 15:48:31 crc 
kubenswrapper[4704]: I1125 15:48:31.092426 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-lbpq9" Nov 25 15:48:32 crc kubenswrapper[4704]: I1125 15:48:32.046287 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-dllmv" event={"ID":"29b053f9-2ac9-4eeb-bb2b-adbe17dfab59","Type":"ContainerStarted","Data":"1775d041ef1300734302f6367aa8bccf209ee0c4a526787683c5efe914c334ac"} Nov 25 15:48:32 crc kubenswrapper[4704]: I1125 15:48:32.046434 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:32 crc kubenswrapper[4704]: I1125 15:48:32.066969 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-dllmv" podStartSLOduration=7.223926308 podStartE2EDuration="14.066942579s" podCreationTimestamp="2025-11-25 15:48:18 +0000 UTC" firstStartedPulling="2025-11-25 15:48:20.279933252 +0000 UTC m=+786.548207033" lastFinishedPulling="2025-11-25 15:48:27.122949523 +0000 UTC m=+793.391223304" observedRunningTime="2025-11-25 15:48:32.065748335 +0000 UTC m=+798.334022126" watchObservedRunningTime="2025-11-25 15:48:32.066942579 +0000 UTC m=+798.335216370" Nov 25 15:48:35 crc kubenswrapper[4704]: I1125 15:48:35.150083 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:35 crc kubenswrapper[4704]: I1125 15:48:35.188429 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:37 crc kubenswrapper[4704]: I1125 15:48:37.414352 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-k7ps5"] Nov 25 15:48:37 crc kubenswrapper[4704]: I1125 15:48:37.415256 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-k7ps5" Nov 25 15:48:37 crc kubenswrapper[4704]: I1125 15:48:37.419406 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 25 15:48:37 crc kubenswrapper[4704]: I1125 15:48:37.419704 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-454s6" Nov 25 15:48:37 crc kubenswrapper[4704]: I1125 15:48:37.425074 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 25 15:48:37 crc kubenswrapper[4704]: I1125 15:48:37.429419 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-k7ps5"] Nov 25 15:48:37 crc kubenswrapper[4704]: I1125 15:48:37.563559 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6qhr\" (UniqueName: \"kubernetes.io/projected/5d06e72d-3674-4274-9c1f-a30b3383b2c9-kube-api-access-l6qhr\") pod \"mariadb-operator-index-k7ps5\" (UID: \"5d06e72d-3674-4274-9c1f-a30b3383b2c9\") " pod="openstack-operators/mariadb-operator-index-k7ps5" Nov 25 15:48:37 crc kubenswrapper[4704]: I1125 15:48:37.664372 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6qhr\" (UniqueName: \"kubernetes.io/projected/5d06e72d-3674-4274-9c1f-a30b3383b2c9-kube-api-access-l6qhr\") pod \"mariadb-operator-index-k7ps5\" (UID: \"5d06e72d-3674-4274-9c1f-a30b3383b2c9\") " pod="openstack-operators/mariadb-operator-index-k7ps5" Nov 25 15:48:37 crc kubenswrapper[4704]: I1125 15:48:37.688423 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6qhr\" (UniqueName: \"kubernetes.io/projected/5d06e72d-3674-4274-9c1f-a30b3383b2c9-kube-api-access-l6qhr\") pod \"mariadb-operator-index-k7ps5\" (UID: \"5d06e72d-3674-4274-9c1f-a30b3383b2c9\") " 
pod="openstack-operators/mariadb-operator-index-k7ps5" Nov 25 15:48:37 crc kubenswrapper[4704]: I1125 15:48:37.743740 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-k7ps5" Nov 25 15:48:37 crc kubenswrapper[4704]: I1125 15:48:37.964546 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:48:37 crc kubenswrapper[4704]: I1125 15:48:37.965054 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:48:38 crc kubenswrapper[4704]: I1125 15:48:38.174948 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-k7ps5"] Nov 25 15:48:38 crc kubenswrapper[4704]: W1125 15:48:38.182998 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d06e72d_3674_4274_9c1f_a30b3383b2c9.slice/crio-67a5d31a2e3f24ab9f56e687b213eb012f63cf7149dfee2e932c43829d2f6842 WatchSource:0}: Error finding container 67a5d31a2e3f24ab9f56e687b213eb012f63cf7149dfee2e932c43829d2f6842: Status 404 returned error can't find the container with id 67a5d31a2e3f24ab9f56e687b213eb012f63cf7149dfee2e932c43829d2f6842 Nov 25 15:48:38 crc kubenswrapper[4704]: I1125 15:48:38.988290 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6c7b4b5f48-2bp8m" Nov 25 15:48:39 crc kubenswrapper[4704]: I1125 15:48:39.121685 4704 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/mariadb-operator-index-k7ps5" event={"ID":"5d06e72d-3674-4274-9c1f-a30b3383b2c9","Type":"ContainerStarted","Data":"67a5d31a2e3f24ab9f56e687b213eb012f63cf7149dfee2e932c43829d2f6842"} Nov 25 15:48:40 crc kubenswrapper[4704]: I1125 15:48:40.128195 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-k7ps5" event={"ID":"5d06e72d-3674-4274-9c1f-a30b3383b2c9","Type":"ContainerStarted","Data":"d4b32fef20abaea57f4e09fc5f57fa3c09d6eb4352f4e142c00f5e27fae784f2"} Nov 25 15:48:40 crc kubenswrapper[4704]: I1125 15:48:40.143453 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-k7ps5" podStartSLOduration=1.494042547 podStartE2EDuration="3.143430009s" podCreationTimestamp="2025-11-25 15:48:37 +0000 UTC" firstStartedPulling="2025-11-25 15:48:38.185567269 +0000 UTC m=+804.453841050" lastFinishedPulling="2025-11-25 15:48:39.834954731 +0000 UTC m=+806.103228512" observedRunningTime="2025-11-25 15:48:40.14170134 +0000 UTC m=+806.409975141" watchObservedRunningTime="2025-11-25 15:48:40.143430009 +0000 UTC m=+806.411703800" Nov 25 15:48:40 crc kubenswrapper[4704]: I1125 15:48:40.152758 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-dllmv" Nov 25 15:48:40 crc kubenswrapper[4704]: I1125 15:48:40.386547 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-l8hl8" Nov 25 15:48:40 crc kubenswrapper[4704]: I1125 15:48:40.795746 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-k7ps5"] Nov 25 15:48:41 crc kubenswrapper[4704]: I1125 15:48:41.400175 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-cmwzk"] Nov 25 15:48:41 crc kubenswrapper[4704]: I1125 15:48:41.401051 4704 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-cmwzk" Nov 25 15:48:41 crc kubenswrapper[4704]: I1125 15:48:41.409832 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-cmwzk"] Nov 25 15:48:41 crc kubenswrapper[4704]: I1125 15:48:41.522103 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gwms\" (UniqueName: \"kubernetes.io/projected/9997000b-ed10-4ef0-9456-02578320964d-kube-api-access-4gwms\") pod \"mariadb-operator-index-cmwzk\" (UID: \"9997000b-ed10-4ef0-9456-02578320964d\") " pod="openstack-operators/mariadb-operator-index-cmwzk" Nov 25 15:48:41 crc kubenswrapper[4704]: I1125 15:48:41.624404 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gwms\" (UniqueName: \"kubernetes.io/projected/9997000b-ed10-4ef0-9456-02578320964d-kube-api-access-4gwms\") pod \"mariadb-operator-index-cmwzk\" (UID: \"9997000b-ed10-4ef0-9456-02578320964d\") " pod="openstack-operators/mariadb-operator-index-cmwzk" Nov 25 15:48:41 crc kubenswrapper[4704]: I1125 15:48:41.646032 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gwms\" (UniqueName: \"kubernetes.io/projected/9997000b-ed10-4ef0-9456-02578320964d-kube-api-access-4gwms\") pod \"mariadb-operator-index-cmwzk\" (UID: \"9997000b-ed10-4ef0-9456-02578320964d\") " pod="openstack-operators/mariadb-operator-index-cmwzk" Nov 25 15:48:41 crc kubenswrapper[4704]: I1125 15:48:41.720346 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-cmwzk" Nov 25 15:48:42 crc kubenswrapper[4704]: I1125 15:48:42.116133 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-cmwzk"] Nov 25 15:48:42 crc kubenswrapper[4704]: I1125 15:48:42.142156 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-cmwzk" event={"ID":"9997000b-ed10-4ef0-9456-02578320964d","Type":"ContainerStarted","Data":"aad7a3e375eebc43acc3253e5a8339e5ae4cb07678e388725e29293b5dd219d2"} Nov 25 15:48:42 crc kubenswrapper[4704]: I1125 15:48:42.142306 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-k7ps5" podUID="5d06e72d-3674-4274-9c1f-a30b3383b2c9" containerName="registry-server" containerID="cri-o://d4b32fef20abaea57f4e09fc5f57fa3c09d6eb4352f4e142c00f5e27fae784f2" gracePeriod=2 Nov 25 15:48:42 crc kubenswrapper[4704]: I1125 15:48:42.571698 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-k7ps5" Nov 25 15:48:42 crc kubenswrapper[4704]: I1125 15:48:42.738976 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6qhr\" (UniqueName: \"kubernetes.io/projected/5d06e72d-3674-4274-9c1f-a30b3383b2c9-kube-api-access-l6qhr\") pod \"5d06e72d-3674-4274-9c1f-a30b3383b2c9\" (UID: \"5d06e72d-3674-4274-9c1f-a30b3383b2c9\") " Nov 25 15:48:42 crc kubenswrapper[4704]: I1125 15:48:42.746439 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d06e72d-3674-4274-9c1f-a30b3383b2c9-kube-api-access-l6qhr" (OuterVolumeSpecName: "kube-api-access-l6qhr") pod "5d06e72d-3674-4274-9c1f-a30b3383b2c9" (UID: "5d06e72d-3674-4274-9c1f-a30b3383b2c9"). InnerVolumeSpecName "kube-api-access-l6qhr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:48:42 crc kubenswrapper[4704]: I1125 15:48:42.841770 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6qhr\" (UniqueName: \"kubernetes.io/projected/5d06e72d-3674-4274-9c1f-a30b3383b2c9-kube-api-access-l6qhr\") on node \"crc\" DevicePath \"\"" Nov 25 15:48:43 crc kubenswrapper[4704]: I1125 15:48:43.149359 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-cmwzk" event={"ID":"9997000b-ed10-4ef0-9456-02578320964d","Type":"ContainerStarted","Data":"a1e670cdab5073e054a50fceee808e65407b94a29bcd7075cf3a47531940a3a5"} Nov 25 15:48:43 crc kubenswrapper[4704]: I1125 15:48:43.153208 4704 generic.go:334] "Generic (PLEG): container finished" podID="5d06e72d-3674-4274-9c1f-a30b3383b2c9" containerID="d4b32fef20abaea57f4e09fc5f57fa3c09d6eb4352f4e142c00f5e27fae784f2" exitCode=0 Nov 25 15:48:43 crc kubenswrapper[4704]: I1125 15:48:43.153250 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-k7ps5" event={"ID":"5d06e72d-3674-4274-9c1f-a30b3383b2c9","Type":"ContainerDied","Data":"d4b32fef20abaea57f4e09fc5f57fa3c09d6eb4352f4e142c00f5e27fae784f2"} Nov 25 15:48:43 crc kubenswrapper[4704]: I1125 15:48:43.153271 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-k7ps5" Nov 25 15:48:43 crc kubenswrapper[4704]: I1125 15:48:43.153304 4704 scope.go:117] "RemoveContainer" containerID="d4b32fef20abaea57f4e09fc5f57fa3c09d6eb4352f4e142c00f5e27fae784f2" Nov 25 15:48:43 crc kubenswrapper[4704]: I1125 15:48:43.153290 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-k7ps5" event={"ID":"5d06e72d-3674-4274-9c1f-a30b3383b2c9","Type":"ContainerDied","Data":"67a5d31a2e3f24ab9f56e687b213eb012f63cf7149dfee2e932c43829d2f6842"} Nov 25 15:48:43 crc kubenswrapper[4704]: I1125 15:48:43.174071 4704 scope.go:117] "RemoveContainer" containerID="d4b32fef20abaea57f4e09fc5f57fa3c09d6eb4352f4e142c00f5e27fae784f2" Nov 25 15:48:43 crc kubenswrapper[4704]: E1125 15:48:43.175494 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b32fef20abaea57f4e09fc5f57fa3c09d6eb4352f4e142c00f5e27fae784f2\": container with ID starting with d4b32fef20abaea57f4e09fc5f57fa3c09d6eb4352f4e142c00f5e27fae784f2 not found: ID does not exist" containerID="d4b32fef20abaea57f4e09fc5f57fa3c09d6eb4352f4e142c00f5e27fae784f2" Nov 25 15:48:43 crc kubenswrapper[4704]: I1125 15:48:43.175561 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b32fef20abaea57f4e09fc5f57fa3c09d6eb4352f4e142c00f5e27fae784f2"} err="failed to get container status \"d4b32fef20abaea57f4e09fc5f57fa3c09d6eb4352f4e142c00f5e27fae784f2\": rpc error: code = NotFound desc = could not find container \"d4b32fef20abaea57f4e09fc5f57fa3c09d6eb4352f4e142c00f5e27fae784f2\": container with ID starting with d4b32fef20abaea57f4e09fc5f57fa3c09d6eb4352f4e142c00f5e27fae784f2 not found: ID does not exist" Nov 25 15:48:43 crc kubenswrapper[4704]: I1125 15:48:43.197113 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-k7ps5"] Nov 25 
15:48:43 crc kubenswrapper[4704]: I1125 15:48:43.201503 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-k7ps5"] Nov 25 15:48:44 crc kubenswrapper[4704]: I1125 15:48:44.177211 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-cmwzk" podStartSLOduration=2.411868436 podStartE2EDuration="3.177185711s" podCreationTimestamp="2025-11-25 15:48:41 +0000 UTC" firstStartedPulling="2025-11-25 15:48:42.133295245 +0000 UTC m=+808.401569026" lastFinishedPulling="2025-11-25 15:48:42.89861252 +0000 UTC m=+809.166886301" observedRunningTime="2025-11-25 15:48:44.174164814 +0000 UTC m=+810.442438595" watchObservedRunningTime="2025-11-25 15:48:44.177185711 +0000 UTC m=+810.445459512" Nov 25 15:48:44 crc kubenswrapper[4704]: I1125 15:48:44.423972 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d06e72d-3674-4274-9c1f-a30b3383b2c9" path="/var/lib/kubelet/pods/5d06e72d-3674-4274-9c1f-a30b3383b2c9/volumes" Nov 25 15:48:46 crc kubenswrapper[4704]: I1125 15:48:46.406471 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-svwq7"] Nov 25 15:48:46 crc kubenswrapper[4704]: E1125 15:48:46.406720 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d06e72d-3674-4274-9c1f-a30b3383b2c9" containerName="registry-server" Nov 25 15:48:46 crc kubenswrapper[4704]: I1125 15:48:46.406732 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d06e72d-3674-4274-9c1f-a30b3383b2c9" containerName="registry-server" Nov 25 15:48:46 crc kubenswrapper[4704]: I1125 15:48:46.406857 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d06e72d-3674-4274-9c1f-a30b3383b2c9" containerName="registry-server" Nov 25 15:48:46 crc kubenswrapper[4704]: I1125 15:48:46.407710 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:48:46 crc kubenswrapper[4704]: I1125 15:48:46.425696 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-svwq7"] Nov 25 15:48:46 crc kubenswrapper[4704]: I1125 15:48:46.485016 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d1fed25-0c00-4b4e-ae49-3871f29375d6-catalog-content\") pod \"redhat-marketplace-svwq7\" (UID: \"7d1fed25-0c00-4b4e-ae49-3871f29375d6\") " pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:48:46 crc kubenswrapper[4704]: I1125 15:48:46.485554 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twmj9\" (UniqueName: \"kubernetes.io/projected/7d1fed25-0c00-4b4e-ae49-3871f29375d6-kube-api-access-twmj9\") pod \"redhat-marketplace-svwq7\" (UID: \"7d1fed25-0c00-4b4e-ae49-3871f29375d6\") " pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:48:46 crc kubenswrapper[4704]: I1125 15:48:46.485590 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d1fed25-0c00-4b4e-ae49-3871f29375d6-utilities\") pod \"redhat-marketplace-svwq7\" (UID: \"7d1fed25-0c00-4b4e-ae49-3871f29375d6\") " pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:48:46 crc kubenswrapper[4704]: I1125 15:48:46.587328 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d1fed25-0c00-4b4e-ae49-3871f29375d6-catalog-content\") pod \"redhat-marketplace-svwq7\" (UID: \"7d1fed25-0c00-4b4e-ae49-3871f29375d6\") " pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:48:46 crc kubenswrapper[4704]: I1125 15:48:46.587685 4704 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-twmj9\" (UniqueName: \"kubernetes.io/projected/7d1fed25-0c00-4b4e-ae49-3871f29375d6-kube-api-access-twmj9\") pod \"redhat-marketplace-svwq7\" (UID: \"7d1fed25-0c00-4b4e-ae49-3871f29375d6\") " pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:48:46 crc kubenswrapper[4704]: I1125 15:48:46.587922 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d1fed25-0c00-4b4e-ae49-3871f29375d6-utilities\") pod \"redhat-marketplace-svwq7\" (UID: \"7d1fed25-0c00-4b4e-ae49-3871f29375d6\") " pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:48:46 crc kubenswrapper[4704]: I1125 15:48:46.588060 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d1fed25-0c00-4b4e-ae49-3871f29375d6-catalog-content\") pod \"redhat-marketplace-svwq7\" (UID: \"7d1fed25-0c00-4b4e-ae49-3871f29375d6\") " pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:48:46 crc kubenswrapper[4704]: I1125 15:48:46.588245 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d1fed25-0c00-4b4e-ae49-3871f29375d6-utilities\") pod \"redhat-marketplace-svwq7\" (UID: \"7d1fed25-0c00-4b4e-ae49-3871f29375d6\") " pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:48:46 crc kubenswrapper[4704]: I1125 15:48:46.611134 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twmj9\" (UniqueName: \"kubernetes.io/projected/7d1fed25-0c00-4b4e-ae49-3871f29375d6-kube-api-access-twmj9\") pod \"redhat-marketplace-svwq7\" (UID: \"7d1fed25-0c00-4b4e-ae49-3871f29375d6\") " pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:48:46 crc kubenswrapper[4704]: I1125 15:48:46.726543 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:48:47 crc kubenswrapper[4704]: I1125 15:48:47.170385 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-svwq7"] Nov 25 15:48:47 crc kubenswrapper[4704]: I1125 15:48:47.186311 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svwq7" event={"ID":"7d1fed25-0c00-4b4e-ae49-3871f29375d6","Type":"ContainerStarted","Data":"1642a2c42494700650a172aaaa7e60dcb72bcf97e74f81877216ab640b8ed661"} Nov 25 15:48:48 crc kubenswrapper[4704]: I1125 15:48:48.193498 4704 generic.go:334] "Generic (PLEG): container finished" podID="7d1fed25-0c00-4b4e-ae49-3871f29375d6" containerID="8cbcee120e7c0f5b88d593164b19cb93c84e2b2b41f59612c685f2f3847275d7" exitCode=0 Nov 25 15:48:48 crc kubenswrapper[4704]: I1125 15:48:48.193557 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svwq7" event={"ID":"7d1fed25-0c00-4b4e-ae49-3871f29375d6","Type":"ContainerDied","Data":"8cbcee120e7c0f5b88d593164b19cb93c84e2b2b41f59612c685f2f3847275d7"} Nov 25 15:48:49 crc kubenswrapper[4704]: I1125 15:48:49.201574 4704 generic.go:334] "Generic (PLEG): container finished" podID="7d1fed25-0c00-4b4e-ae49-3871f29375d6" containerID="7b3b1e6f2af0bc53c0adad8e80e1a76a69a92f7d05c0a58188ea68b48609f825" exitCode=0 Nov 25 15:48:49 crc kubenswrapper[4704]: I1125 15:48:49.201670 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svwq7" event={"ID":"7d1fed25-0c00-4b4e-ae49-3871f29375d6","Type":"ContainerDied","Data":"7b3b1e6f2af0bc53c0adad8e80e1a76a69a92f7d05c0a58188ea68b48609f825"} Nov 25 15:48:50 crc kubenswrapper[4704]: I1125 15:48:50.211984 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svwq7" 
event={"ID":"7d1fed25-0c00-4b4e-ae49-3871f29375d6","Type":"ContainerStarted","Data":"db8628bc1056ec89b4450b0af2606be191bd8f8cd069b93f53ab972218cc59ff"} Nov 25 15:48:50 crc kubenswrapper[4704]: I1125 15:48:50.247774 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-svwq7" podStartSLOduration=2.811530807 podStartE2EDuration="4.247751093s" podCreationTimestamp="2025-11-25 15:48:46 +0000 UTC" firstStartedPulling="2025-11-25 15:48:48.196690713 +0000 UTC m=+814.464964494" lastFinishedPulling="2025-11-25 15:48:49.632910999 +0000 UTC m=+815.901184780" observedRunningTime="2025-11-25 15:48:50.245747476 +0000 UTC m=+816.514021277" watchObservedRunningTime="2025-11-25 15:48:50.247751093 +0000 UTC m=+816.516024894" Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.206156 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2b4qj"] Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.208028 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.223897 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2b4qj"] Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.350234 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-catalog-content\") pod \"community-operators-2b4qj\" (UID: \"2a0ed576-6fe0-4c40-8d66-344b8c18d75d\") " pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.350677 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttqnm\" (UniqueName: \"kubernetes.io/projected/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-kube-api-access-ttqnm\") pod \"community-operators-2b4qj\" (UID: \"2a0ed576-6fe0-4c40-8d66-344b8c18d75d\") " pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.350848 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-utilities\") pod \"community-operators-2b4qj\" (UID: \"2a0ed576-6fe0-4c40-8d66-344b8c18d75d\") " pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.452508 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttqnm\" (UniqueName: \"kubernetes.io/projected/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-kube-api-access-ttqnm\") pod \"community-operators-2b4qj\" (UID: \"2a0ed576-6fe0-4c40-8d66-344b8c18d75d\") " pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.452606 4704 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-utilities\") pod \"community-operators-2b4qj\" (UID: \"2a0ed576-6fe0-4c40-8d66-344b8c18d75d\") " pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.452655 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-catalog-content\") pod \"community-operators-2b4qj\" (UID: \"2a0ed576-6fe0-4c40-8d66-344b8c18d75d\") " pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.453282 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-catalog-content\") pod \"community-operators-2b4qj\" (UID: \"2a0ed576-6fe0-4c40-8d66-344b8c18d75d\") " pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.453743 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-utilities\") pod \"community-operators-2b4qj\" (UID: \"2a0ed576-6fe0-4c40-8d66-344b8c18d75d\") " pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.483269 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttqnm\" (UniqueName: \"kubernetes.io/projected/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-kube-api-access-ttqnm\") pod \"community-operators-2b4qj\" (UID: \"2a0ed576-6fe0-4c40-8d66-344b8c18d75d\") " pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.531156 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.721004 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-cmwzk" Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.721596 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-cmwzk" Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.755636 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-cmwzk" Nov 25 15:48:51 crc kubenswrapper[4704]: I1125 15:48:51.802458 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2b4qj"] Nov 25 15:48:52 crc kubenswrapper[4704]: I1125 15:48:52.227426 4704 generic.go:334] "Generic (PLEG): container finished" podID="2a0ed576-6fe0-4c40-8d66-344b8c18d75d" containerID="5bfb4beb15a765c76bc393184d3e121c2b3787bfa86ebd004c8584cbe647f9cd" exitCode=0 Nov 25 15:48:52 crc kubenswrapper[4704]: I1125 15:48:52.227467 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2b4qj" event={"ID":"2a0ed576-6fe0-4c40-8d66-344b8c18d75d","Type":"ContainerDied","Data":"5bfb4beb15a765c76bc393184d3e121c2b3787bfa86ebd004c8584cbe647f9cd"} Nov 25 15:48:52 crc kubenswrapper[4704]: I1125 15:48:52.227994 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2b4qj" event={"ID":"2a0ed576-6fe0-4c40-8d66-344b8c18d75d","Type":"ContainerStarted","Data":"eccef833ae2a1ac66d44a0afb5e71c6df760fdd551f942d8d51f52423e7eb7a1"} Nov 25 15:48:52 crc kubenswrapper[4704]: I1125 15:48:52.257507 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-cmwzk" Nov 25 15:48:55 crc kubenswrapper[4704]: I1125 15:48:55.248940 4704 
generic.go:334] "Generic (PLEG): container finished" podID="2a0ed576-6fe0-4c40-8d66-344b8c18d75d" containerID="16c2b4b90e9576d5ce2af0365cc6f81f53a2604b8ed4769e13c575b44cdba823" exitCode=0 Nov 25 15:48:55 crc kubenswrapper[4704]: I1125 15:48:55.249010 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2b4qj" event={"ID":"2a0ed576-6fe0-4c40-8d66-344b8c18d75d","Type":"ContainerDied","Data":"16c2b4b90e9576d5ce2af0365cc6f81f53a2604b8ed4769e13c575b44cdba823"} Nov 25 15:48:56 crc kubenswrapper[4704]: I1125 15:48:56.261536 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2b4qj" event={"ID":"2a0ed576-6fe0-4c40-8d66-344b8c18d75d","Type":"ContainerStarted","Data":"535e0d93ba35796c1cec49926100a9ada310cadfb2323d22ef704de7a5e23c24"} Nov 25 15:48:56 crc kubenswrapper[4704]: I1125 15:48:56.280264 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2b4qj" podStartSLOduration=1.54546474 podStartE2EDuration="5.280239106s" podCreationTimestamp="2025-11-25 15:48:51 +0000 UTC" firstStartedPulling="2025-11-25 15:48:52.230378353 +0000 UTC m=+818.498652134" lastFinishedPulling="2025-11-25 15:48:55.965152719 +0000 UTC m=+822.233426500" observedRunningTime="2025-11-25 15:48:56.278743963 +0000 UTC m=+822.547017744" watchObservedRunningTime="2025-11-25 15:48:56.280239106 +0000 UTC m=+822.548512887" Nov 25 15:48:56 crc kubenswrapper[4704]: I1125 15:48:56.726908 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:48:56 crc kubenswrapper[4704]: I1125 15:48:56.727412 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:48:56 crc kubenswrapper[4704]: I1125 15:48:56.772728 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:48:57 crc kubenswrapper[4704]: I1125 15:48:57.312829 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:48:59 crc kubenswrapper[4704]: I1125 15:48:59.243901 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56"] Nov 25 15:48:59 crc kubenswrapper[4704]: I1125 15:48:59.245717 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" Nov 25 15:48:59 crc kubenswrapper[4704]: I1125 15:48:59.248055 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8zdtm" Nov 25 15:48:59 crc kubenswrapper[4704]: I1125 15:48:59.255546 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56"] Nov 25 15:48:59 crc kubenswrapper[4704]: I1125 15:48:59.373422 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3649f81a-9887-4dba-91e7-66192abf74df-bundle\") pod \"c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56\" (UID: \"3649f81a-9887-4dba-91e7-66192abf74df\") " pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" Nov 25 15:48:59 crc kubenswrapper[4704]: I1125 15:48:59.373832 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9pjc\" (UniqueName: \"kubernetes.io/projected/3649f81a-9887-4dba-91e7-66192abf74df-kube-api-access-w9pjc\") pod \"c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56\" (UID: \"3649f81a-9887-4dba-91e7-66192abf74df\") " 
pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" Nov 25 15:48:59 crc kubenswrapper[4704]: I1125 15:48:59.373942 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3649f81a-9887-4dba-91e7-66192abf74df-util\") pod \"c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56\" (UID: \"3649f81a-9887-4dba-91e7-66192abf74df\") " pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" Nov 25 15:48:59 crc kubenswrapper[4704]: I1125 15:48:59.475738 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3649f81a-9887-4dba-91e7-66192abf74df-bundle\") pod \"c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56\" (UID: \"3649f81a-9887-4dba-91e7-66192abf74df\") " pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" Nov 25 15:48:59 crc kubenswrapper[4704]: I1125 15:48:59.475913 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9pjc\" (UniqueName: \"kubernetes.io/projected/3649f81a-9887-4dba-91e7-66192abf74df-kube-api-access-w9pjc\") pod \"c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56\" (UID: \"3649f81a-9887-4dba-91e7-66192abf74df\") " pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" Nov 25 15:48:59 crc kubenswrapper[4704]: I1125 15:48:59.475984 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3649f81a-9887-4dba-91e7-66192abf74df-util\") pod \"c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56\" (UID: \"3649f81a-9887-4dba-91e7-66192abf74df\") " pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" Nov 25 15:48:59 crc kubenswrapper[4704]: I1125 
15:48:59.476692 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3649f81a-9887-4dba-91e7-66192abf74df-util\") pod \"c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56\" (UID: \"3649f81a-9887-4dba-91e7-66192abf74df\") " pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" Nov 25 15:48:59 crc kubenswrapper[4704]: I1125 15:48:59.477194 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3649f81a-9887-4dba-91e7-66192abf74df-bundle\") pod \"c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56\" (UID: \"3649f81a-9887-4dba-91e7-66192abf74df\") " pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" Nov 25 15:48:59 crc kubenswrapper[4704]: I1125 15:48:59.502615 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9pjc\" (UniqueName: \"kubernetes.io/projected/3649f81a-9887-4dba-91e7-66192abf74df-kube-api-access-w9pjc\") pod \"c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56\" (UID: \"3649f81a-9887-4dba-91e7-66192abf74df\") " pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" Nov 25 15:48:59 crc kubenswrapper[4704]: I1125 15:48:59.569425 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" Nov 25 15:48:59 crc kubenswrapper[4704]: I1125 15:48:59.981390 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56"] Nov 25 15:48:59 crc kubenswrapper[4704]: W1125 15:48:59.988117 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3649f81a_9887_4dba_91e7_66192abf74df.slice/crio-e0f97e3e83f9bd1ab07524324692509f0025730b3880fd8e750a356315aae137 WatchSource:0}: Error finding container e0f97e3e83f9bd1ab07524324692509f0025730b3880fd8e750a356315aae137: Status 404 returned error can't find the container with id e0f97e3e83f9bd1ab07524324692509f0025730b3880fd8e750a356315aae137 Nov 25 15:49:00 crc kubenswrapper[4704]: I1125 15:49:00.288594 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" event={"ID":"3649f81a-9887-4dba-91e7-66192abf74df","Type":"ContainerStarted","Data":"e0f97e3e83f9bd1ab07524324692509f0025730b3880fd8e750a356315aae137"} Nov 25 15:49:00 crc kubenswrapper[4704]: I1125 15:49:00.594698 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-svwq7"] Nov 25 15:49:00 crc kubenswrapper[4704]: I1125 15:49:00.595069 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-svwq7" podUID="7d1fed25-0c00-4b4e-ae49-3871f29375d6" containerName="registry-server" containerID="cri-o://db8628bc1056ec89b4450b0af2606be191bd8f8cd069b93f53ab972218cc59ff" gracePeriod=2 Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.017402 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.101821 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d1fed25-0c00-4b4e-ae49-3871f29375d6-utilities\") pod \"7d1fed25-0c00-4b4e-ae49-3871f29375d6\" (UID: \"7d1fed25-0c00-4b4e-ae49-3871f29375d6\") " Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.101926 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d1fed25-0c00-4b4e-ae49-3871f29375d6-catalog-content\") pod \"7d1fed25-0c00-4b4e-ae49-3871f29375d6\" (UID: \"7d1fed25-0c00-4b4e-ae49-3871f29375d6\") " Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.101949 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twmj9\" (UniqueName: \"kubernetes.io/projected/7d1fed25-0c00-4b4e-ae49-3871f29375d6-kube-api-access-twmj9\") pod \"7d1fed25-0c00-4b4e-ae49-3871f29375d6\" (UID: \"7d1fed25-0c00-4b4e-ae49-3871f29375d6\") " Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.103092 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d1fed25-0c00-4b4e-ae49-3871f29375d6-utilities" (OuterVolumeSpecName: "utilities") pod "7d1fed25-0c00-4b4e-ae49-3871f29375d6" (UID: "7d1fed25-0c00-4b4e-ae49-3871f29375d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.109840 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d1fed25-0c00-4b4e-ae49-3871f29375d6-kube-api-access-twmj9" (OuterVolumeSpecName: "kube-api-access-twmj9") pod "7d1fed25-0c00-4b4e-ae49-3871f29375d6" (UID: "7d1fed25-0c00-4b4e-ae49-3871f29375d6"). InnerVolumeSpecName "kube-api-access-twmj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.121512 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d1fed25-0c00-4b4e-ae49-3871f29375d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d1fed25-0c00-4b4e-ae49-3871f29375d6" (UID: "7d1fed25-0c00-4b4e-ae49-3871f29375d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.203708 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d1fed25-0c00-4b4e-ae49-3871f29375d6-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.203766 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d1fed25-0c00-4b4e-ae49-3871f29375d6-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.203806 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twmj9\" (UniqueName: \"kubernetes.io/projected/7d1fed25-0c00-4b4e-ae49-3871f29375d6-kube-api-access-twmj9\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.296387 4704 generic.go:334] "Generic (PLEG): container finished" podID="3649f81a-9887-4dba-91e7-66192abf74df" containerID="31e53a6f810c9ba17e6cae01745863064a933dbb9ace7e2835dda892e346cecf" exitCode=0 Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.296481 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" event={"ID":"3649f81a-9887-4dba-91e7-66192abf74df","Type":"ContainerDied","Data":"31e53a6f810c9ba17e6cae01745863064a933dbb9ace7e2835dda892e346cecf"} Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.302415 4704 
generic.go:334] "Generic (PLEG): container finished" podID="7d1fed25-0c00-4b4e-ae49-3871f29375d6" containerID="db8628bc1056ec89b4450b0af2606be191bd8f8cd069b93f53ab972218cc59ff" exitCode=0 Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.302467 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svwq7" event={"ID":"7d1fed25-0c00-4b4e-ae49-3871f29375d6","Type":"ContainerDied","Data":"db8628bc1056ec89b4450b0af2606be191bd8f8cd069b93f53ab972218cc59ff"} Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.302501 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svwq7" event={"ID":"7d1fed25-0c00-4b4e-ae49-3871f29375d6","Type":"ContainerDied","Data":"1642a2c42494700650a172aaaa7e60dcb72bcf97e74f81877216ab640b8ed661"} Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.302533 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svwq7" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.302550 4704 scope.go:117] "RemoveContainer" containerID="db8628bc1056ec89b4450b0af2606be191bd8f8cd069b93f53ab972218cc59ff" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.335517 4704 scope.go:117] "RemoveContainer" containerID="7b3b1e6f2af0bc53c0adad8e80e1a76a69a92f7d05c0a58188ea68b48609f825" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.350606 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-svwq7"] Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.354642 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-svwq7"] Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.371062 4704 scope.go:117] "RemoveContainer" containerID="8cbcee120e7c0f5b88d593164b19cb93c84e2b2b41f59612c685f2f3847275d7" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.392052 4704 scope.go:117] 
"RemoveContainer" containerID="db8628bc1056ec89b4450b0af2606be191bd8f8cd069b93f53ab972218cc59ff" Nov 25 15:49:01 crc kubenswrapper[4704]: E1125 15:49:01.392815 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db8628bc1056ec89b4450b0af2606be191bd8f8cd069b93f53ab972218cc59ff\": container with ID starting with db8628bc1056ec89b4450b0af2606be191bd8f8cd069b93f53ab972218cc59ff not found: ID does not exist" containerID="db8628bc1056ec89b4450b0af2606be191bd8f8cd069b93f53ab972218cc59ff" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.392877 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8628bc1056ec89b4450b0af2606be191bd8f8cd069b93f53ab972218cc59ff"} err="failed to get container status \"db8628bc1056ec89b4450b0af2606be191bd8f8cd069b93f53ab972218cc59ff\": rpc error: code = NotFound desc = could not find container \"db8628bc1056ec89b4450b0af2606be191bd8f8cd069b93f53ab972218cc59ff\": container with ID starting with db8628bc1056ec89b4450b0af2606be191bd8f8cd069b93f53ab972218cc59ff not found: ID does not exist" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.392922 4704 scope.go:117] "RemoveContainer" containerID="7b3b1e6f2af0bc53c0adad8e80e1a76a69a92f7d05c0a58188ea68b48609f825" Nov 25 15:49:01 crc kubenswrapper[4704]: E1125 15:49:01.393297 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b3b1e6f2af0bc53c0adad8e80e1a76a69a92f7d05c0a58188ea68b48609f825\": container with ID starting with 7b3b1e6f2af0bc53c0adad8e80e1a76a69a92f7d05c0a58188ea68b48609f825 not found: ID does not exist" containerID="7b3b1e6f2af0bc53c0adad8e80e1a76a69a92f7d05c0a58188ea68b48609f825" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.393328 4704 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7b3b1e6f2af0bc53c0adad8e80e1a76a69a92f7d05c0a58188ea68b48609f825"} err="failed to get container status \"7b3b1e6f2af0bc53c0adad8e80e1a76a69a92f7d05c0a58188ea68b48609f825\": rpc error: code = NotFound desc = could not find container \"7b3b1e6f2af0bc53c0adad8e80e1a76a69a92f7d05c0a58188ea68b48609f825\": container with ID starting with 7b3b1e6f2af0bc53c0adad8e80e1a76a69a92f7d05c0a58188ea68b48609f825 not found: ID does not exist" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.393357 4704 scope.go:117] "RemoveContainer" containerID="8cbcee120e7c0f5b88d593164b19cb93c84e2b2b41f59612c685f2f3847275d7" Nov 25 15:49:01 crc kubenswrapper[4704]: E1125 15:49:01.393741 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cbcee120e7c0f5b88d593164b19cb93c84e2b2b41f59612c685f2f3847275d7\": container with ID starting with 8cbcee120e7c0f5b88d593164b19cb93c84e2b2b41f59612c685f2f3847275d7 not found: ID does not exist" containerID="8cbcee120e7c0f5b88d593164b19cb93c84e2b2b41f59612c685f2f3847275d7" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.393826 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cbcee120e7c0f5b88d593164b19cb93c84e2b2b41f59612c685f2f3847275d7"} err="failed to get container status \"8cbcee120e7c0f5b88d593164b19cb93c84e2b2b41f59612c685f2f3847275d7\": rpc error: code = NotFound desc = could not find container \"8cbcee120e7c0f5b88d593164b19cb93c84e2b2b41f59612c685f2f3847275d7\": container with ID starting with 8cbcee120e7c0f5b88d593164b19cb93c84e2b2b41f59612c685f2f3847275d7 not found: ID does not exist" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.532073 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.532171 4704 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:49:01 crc kubenswrapper[4704]: I1125 15:49:01.580891 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:49:02 crc kubenswrapper[4704]: I1125 15:49:02.311091 4704 generic.go:334] "Generic (PLEG): container finished" podID="3649f81a-9887-4dba-91e7-66192abf74df" containerID="1180664a371de0b93cc42308c555096a627c0c1d2ed558fa9e9298a865bbdf40" exitCode=0 Nov 25 15:49:02 crc kubenswrapper[4704]: I1125 15:49:02.311284 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" event={"ID":"3649f81a-9887-4dba-91e7-66192abf74df","Type":"ContainerDied","Data":"1180664a371de0b93cc42308c555096a627c0c1d2ed558fa9e9298a865bbdf40"} Nov 25 15:49:02 crc kubenswrapper[4704]: I1125 15:49:02.367185 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:49:02 crc kubenswrapper[4704]: I1125 15:49:02.424230 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d1fed25-0c00-4b4e-ae49-3871f29375d6" path="/var/lib/kubelet/pods/7d1fed25-0c00-4b4e-ae49-3871f29375d6/volumes" Nov 25 15:49:03 crc kubenswrapper[4704]: I1125 15:49:03.320361 4704 generic.go:334] "Generic (PLEG): container finished" podID="3649f81a-9887-4dba-91e7-66192abf74df" containerID="6e9b2c37b8d4b1458a414e174d878f5f0169e0d5713d2d78b90abe8ef5b6eeb3" exitCode=0 Nov 25 15:49:03 crc kubenswrapper[4704]: I1125 15:49:03.321783 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" event={"ID":"3649f81a-9887-4dba-91e7-66192abf74df","Type":"ContainerDied","Data":"6e9b2c37b8d4b1458a414e174d878f5f0169e0d5713d2d78b90abe8ef5b6eeb3"} Nov 25 15:49:04 crc kubenswrapper[4704]: I1125 
15:49:04.608282 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" Nov 25 15:49:04 crc kubenswrapper[4704]: I1125 15:49:04.755571 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3649f81a-9887-4dba-91e7-66192abf74df-util\") pod \"3649f81a-9887-4dba-91e7-66192abf74df\" (UID: \"3649f81a-9887-4dba-91e7-66192abf74df\") " Nov 25 15:49:04 crc kubenswrapper[4704]: I1125 15:49:04.755683 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3649f81a-9887-4dba-91e7-66192abf74df-bundle\") pod \"3649f81a-9887-4dba-91e7-66192abf74df\" (UID: \"3649f81a-9887-4dba-91e7-66192abf74df\") " Nov 25 15:49:04 crc kubenswrapper[4704]: I1125 15:49:04.755711 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9pjc\" (UniqueName: \"kubernetes.io/projected/3649f81a-9887-4dba-91e7-66192abf74df-kube-api-access-w9pjc\") pod \"3649f81a-9887-4dba-91e7-66192abf74df\" (UID: \"3649f81a-9887-4dba-91e7-66192abf74df\") " Nov 25 15:49:04 crc kubenswrapper[4704]: I1125 15:49:04.757413 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3649f81a-9887-4dba-91e7-66192abf74df-bundle" (OuterVolumeSpecName: "bundle") pod "3649f81a-9887-4dba-91e7-66192abf74df" (UID: "3649f81a-9887-4dba-91e7-66192abf74df"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:49:04 crc kubenswrapper[4704]: I1125 15:49:04.763200 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3649f81a-9887-4dba-91e7-66192abf74df-kube-api-access-w9pjc" (OuterVolumeSpecName: "kube-api-access-w9pjc") pod "3649f81a-9887-4dba-91e7-66192abf74df" (UID: "3649f81a-9887-4dba-91e7-66192abf74df"). InnerVolumeSpecName "kube-api-access-w9pjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:49:04 crc kubenswrapper[4704]: I1125 15:49:04.778090 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3649f81a-9887-4dba-91e7-66192abf74df-util" (OuterVolumeSpecName: "util") pod "3649f81a-9887-4dba-91e7-66192abf74df" (UID: "3649f81a-9887-4dba-91e7-66192abf74df"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:49:04 crc kubenswrapper[4704]: I1125 15:49:04.857227 4704 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3649f81a-9887-4dba-91e7-66192abf74df-util\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:04 crc kubenswrapper[4704]: I1125 15:49:04.857269 4704 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3649f81a-9887-4dba-91e7-66192abf74df-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:04 crc kubenswrapper[4704]: I1125 15:49:04.857284 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9pjc\" (UniqueName: \"kubernetes.io/projected/3649f81a-9887-4dba-91e7-66192abf74df-kube-api-access-w9pjc\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.006029 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z79ck"] Nov 25 15:49:05 crc kubenswrapper[4704]: E1125 15:49:05.006334 4704 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3649f81a-9887-4dba-91e7-66192abf74df" containerName="extract" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.006354 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="3649f81a-9887-4dba-91e7-66192abf74df" containerName="extract" Nov 25 15:49:05 crc kubenswrapper[4704]: E1125 15:49:05.006374 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3649f81a-9887-4dba-91e7-66192abf74df" containerName="pull" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.006382 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="3649f81a-9887-4dba-91e7-66192abf74df" containerName="pull" Nov 25 15:49:05 crc kubenswrapper[4704]: E1125 15:49:05.006396 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1fed25-0c00-4b4e-ae49-3871f29375d6" containerName="extract-utilities" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.006404 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1fed25-0c00-4b4e-ae49-3871f29375d6" containerName="extract-utilities" Nov 25 15:49:05 crc kubenswrapper[4704]: E1125 15:49:05.006421 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1fed25-0c00-4b4e-ae49-3871f29375d6" containerName="extract-content" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.006429 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1fed25-0c00-4b4e-ae49-3871f29375d6" containerName="extract-content" Nov 25 15:49:05 crc kubenswrapper[4704]: E1125 15:49:05.006440 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3649f81a-9887-4dba-91e7-66192abf74df" containerName="util" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.006447 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="3649f81a-9887-4dba-91e7-66192abf74df" containerName="util" Nov 25 15:49:05 crc kubenswrapper[4704]: E1125 15:49:05.006461 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1fed25-0c00-4b4e-ae49-3871f29375d6" 
containerName="registry-server" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.006469 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1fed25-0c00-4b4e-ae49-3871f29375d6" containerName="registry-server" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.006612 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="3649f81a-9887-4dba-91e7-66192abf74df" containerName="extract" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.006630 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d1fed25-0c00-4b4e-ae49-3871f29375d6" containerName="registry-server" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.007838 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.026000 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z79ck"] Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.161766 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-utilities\") pod \"certified-operators-z79ck\" (UID: \"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d\") " pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.161939 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-catalog-content\") pod \"certified-operators-z79ck\" (UID: \"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d\") " pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.161970 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df4ct\" 
(UniqueName: \"kubernetes.io/projected/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-kube-api-access-df4ct\") pod \"certified-operators-z79ck\" (UID: \"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d\") " pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.262897 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-utilities\") pod \"certified-operators-z79ck\" (UID: \"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d\") " pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.262984 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df4ct\" (UniqueName: \"kubernetes.io/projected/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-kube-api-access-df4ct\") pod \"certified-operators-z79ck\" (UID: \"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d\") " pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.263004 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-catalog-content\") pod \"certified-operators-z79ck\" (UID: \"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d\") " pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.263474 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-utilities\") pod \"certified-operators-z79ck\" (UID: \"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d\") " pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.263488 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-catalog-content\") pod \"certified-operators-z79ck\" (UID: \"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d\") " pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.283228 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df4ct\" (UniqueName: \"kubernetes.io/projected/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-kube-api-access-df4ct\") pod \"certified-operators-z79ck\" (UID: \"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d\") " pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.328546 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.335599 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" event={"ID":"3649f81a-9887-4dba-91e7-66192abf74df","Type":"ContainerDied","Data":"e0f97e3e83f9bd1ab07524324692509f0025730b3880fd8e750a356315aae137"} Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.335648 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0f97e3e83f9bd1ab07524324692509f0025730b3880fd8e750a356315aae137" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.335654 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56" Nov 25 15:49:05 crc kubenswrapper[4704]: I1125 15:49:05.661344 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z79ck"] Nov 25 15:49:06 crc kubenswrapper[4704]: I1125 15:49:06.343882 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z79ck" event={"ID":"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d","Type":"ContainerStarted","Data":"da225cd4444a7a61cf51dfe780fe8a03601c0225bb59ca49d15dbbba88ed821a"} Nov 25 15:49:06 crc kubenswrapper[4704]: I1125 15:49:06.343948 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z79ck" event={"ID":"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d","Type":"ContainerStarted","Data":"cf9baed6dcc9054ce86a985e5f47241de7120fb5d3d94cb75fc3d22e210c14ab"} Nov 25 15:49:07 crc kubenswrapper[4704]: I1125 15:49:07.351214 4704 generic.go:334] "Generic (PLEG): container finished" podID="e1085680-6aa6-461c-9c6b-3c4b1ed0a22d" containerID="da225cd4444a7a61cf51dfe780fe8a03601c0225bb59ca49d15dbbba88ed821a" exitCode=0 Nov 25 15:49:07 crc kubenswrapper[4704]: I1125 15:49:07.351605 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z79ck" event={"ID":"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d","Type":"ContainerDied","Data":"da225cd4444a7a61cf51dfe780fe8a03601c0225bb59ca49d15dbbba88ed821a"} Nov 25 15:49:07 crc kubenswrapper[4704]: I1125 15:49:07.965128 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:49:07 crc kubenswrapper[4704]: I1125 15:49:07.965201 4704 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:49:08 crc kubenswrapper[4704]: I1125 15:49:08.359860 4704 generic.go:334] "Generic (PLEG): container finished" podID="e1085680-6aa6-461c-9c6b-3c4b1ed0a22d" containerID="7a3333fee0a354b7f924dde864bddae5bb6fa32291e5e510bbdecb1ad1b6be0b" exitCode=0 Nov 25 15:49:08 crc kubenswrapper[4704]: I1125 15:49:08.359918 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z79ck" event={"ID":"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d","Type":"ContainerDied","Data":"7a3333fee0a354b7f924dde864bddae5bb6fa32291e5e510bbdecb1ad1b6be0b"} Nov 25 15:49:08 crc kubenswrapper[4704]: I1125 15:49:08.397029 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2b4qj"] Nov 25 15:49:08 crc kubenswrapper[4704]: I1125 15:49:08.397639 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2b4qj" podUID="2a0ed576-6fe0-4c40-8d66-344b8c18d75d" containerName="registry-server" containerID="cri-o://535e0d93ba35796c1cec49926100a9ada310cadfb2323d22ef704de7a5e23c24" gracePeriod=2 Nov 25 15:49:08 crc kubenswrapper[4704]: I1125 15:49:08.799468 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:49:08 crc kubenswrapper[4704]: I1125 15:49:08.922681 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-utilities\") pod \"2a0ed576-6fe0-4c40-8d66-344b8c18d75d\" (UID: \"2a0ed576-6fe0-4c40-8d66-344b8c18d75d\") " Nov 25 15:49:08 crc kubenswrapper[4704]: I1125 15:49:08.922827 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttqnm\" (UniqueName: \"kubernetes.io/projected/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-kube-api-access-ttqnm\") pod \"2a0ed576-6fe0-4c40-8d66-344b8c18d75d\" (UID: \"2a0ed576-6fe0-4c40-8d66-344b8c18d75d\") " Nov 25 15:49:08 crc kubenswrapper[4704]: I1125 15:49:08.922940 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-catalog-content\") pod \"2a0ed576-6fe0-4c40-8d66-344b8c18d75d\" (UID: \"2a0ed576-6fe0-4c40-8d66-344b8c18d75d\") " Nov 25 15:49:08 crc kubenswrapper[4704]: I1125 15:49:08.924155 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-utilities" (OuterVolumeSpecName: "utilities") pod "2a0ed576-6fe0-4c40-8d66-344b8c18d75d" (UID: "2a0ed576-6fe0-4c40-8d66-344b8c18d75d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:49:08 crc kubenswrapper[4704]: I1125 15:49:08.934983 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-kube-api-access-ttqnm" (OuterVolumeSpecName: "kube-api-access-ttqnm") pod "2a0ed576-6fe0-4c40-8d66-344b8c18d75d" (UID: "2a0ed576-6fe0-4c40-8d66-344b8c18d75d"). InnerVolumeSpecName "kube-api-access-ttqnm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:49:08 crc kubenswrapper[4704]: I1125 15:49:08.983912 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a0ed576-6fe0-4c40-8d66-344b8c18d75d" (UID: "2a0ed576-6fe0-4c40-8d66-344b8c18d75d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.024101 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.024150 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttqnm\" (UniqueName: \"kubernetes.io/projected/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-kube-api-access-ttqnm\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.024160 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0ed576-6fe0-4c40-8d66-344b8c18d75d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.368739 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z79ck" event={"ID":"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d","Type":"ContainerStarted","Data":"4882b6c4175f3a33580f2ae0f91e5cbcb5acea642bf8d5cc23c7cfe7380f4242"} Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.371071 4704 generic.go:334] "Generic (PLEG): container finished" podID="2a0ed576-6fe0-4c40-8d66-344b8c18d75d" containerID="535e0d93ba35796c1cec49926100a9ada310cadfb2323d22ef704de7a5e23c24" exitCode=0 Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.371115 4704 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-2b4qj" event={"ID":"2a0ed576-6fe0-4c40-8d66-344b8c18d75d","Type":"ContainerDied","Data":"535e0d93ba35796c1cec49926100a9ada310cadfb2323d22ef704de7a5e23c24"} Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.371124 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2b4qj" Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.371142 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2b4qj" event={"ID":"2a0ed576-6fe0-4c40-8d66-344b8c18d75d","Type":"ContainerDied","Data":"eccef833ae2a1ac66d44a0afb5e71c6df760fdd551f942d8d51f52423e7eb7a1"} Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.371164 4704 scope.go:117] "RemoveContainer" containerID="535e0d93ba35796c1cec49926100a9ada310cadfb2323d22ef704de7a5e23c24" Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.396075 4704 scope.go:117] "RemoveContainer" containerID="16c2b4b90e9576d5ce2af0365cc6f81f53a2604b8ed4769e13c575b44cdba823" Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.435570 4704 scope.go:117] "RemoveContainer" containerID="5bfb4beb15a765c76bc393184d3e121c2b3787bfa86ebd004c8584cbe647f9cd" Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.445008 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z79ck" podStartSLOduration=4.001674725 podStartE2EDuration="5.444983023s" podCreationTimestamp="2025-11-25 15:49:04 +0000 UTC" firstStartedPulling="2025-11-25 15:49:07.353436294 +0000 UTC m=+833.621710075" lastFinishedPulling="2025-11-25 15:49:08.796744592 +0000 UTC m=+835.065018373" observedRunningTime="2025-11-25 15:49:09.394915599 +0000 UTC m=+835.663189390" watchObservedRunningTime="2025-11-25 15:49:09.444983023 +0000 UTC m=+835.713256804" Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.446095 4704 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/community-operators-2b4qj"] Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.453504 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2b4qj"] Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.462833 4704 scope.go:117] "RemoveContainer" containerID="535e0d93ba35796c1cec49926100a9ada310cadfb2323d22ef704de7a5e23c24" Nov 25 15:49:09 crc kubenswrapper[4704]: E1125 15:49:09.464709 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"535e0d93ba35796c1cec49926100a9ada310cadfb2323d22ef704de7a5e23c24\": container with ID starting with 535e0d93ba35796c1cec49926100a9ada310cadfb2323d22ef704de7a5e23c24 not found: ID does not exist" containerID="535e0d93ba35796c1cec49926100a9ada310cadfb2323d22ef704de7a5e23c24" Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.464762 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"535e0d93ba35796c1cec49926100a9ada310cadfb2323d22ef704de7a5e23c24"} err="failed to get container status \"535e0d93ba35796c1cec49926100a9ada310cadfb2323d22ef704de7a5e23c24\": rpc error: code = NotFound desc = could not find container \"535e0d93ba35796c1cec49926100a9ada310cadfb2323d22ef704de7a5e23c24\": container with ID starting with 535e0d93ba35796c1cec49926100a9ada310cadfb2323d22ef704de7a5e23c24 not found: ID does not exist" Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.464809 4704 scope.go:117] "RemoveContainer" containerID="16c2b4b90e9576d5ce2af0365cc6f81f53a2604b8ed4769e13c575b44cdba823" Nov 25 15:49:09 crc kubenswrapper[4704]: E1125 15:49:09.465615 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16c2b4b90e9576d5ce2af0365cc6f81f53a2604b8ed4769e13c575b44cdba823\": container with ID starting with 
16c2b4b90e9576d5ce2af0365cc6f81f53a2604b8ed4769e13c575b44cdba823 not found: ID does not exist" containerID="16c2b4b90e9576d5ce2af0365cc6f81f53a2604b8ed4769e13c575b44cdba823" Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.465659 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16c2b4b90e9576d5ce2af0365cc6f81f53a2604b8ed4769e13c575b44cdba823"} err="failed to get container status \"16c2b4b90e9576d5ce2af0365cc6f81f53a2604b8ed4769e13c575b44cdba823\": rpc error: code = NotFound desc = could not find container \"16c2b4b90e9576d5ce2af0365cc6f81f53a2604b8ed4769e13c575b44cdba823\": container with ID starting with 16c2b4b90e9576d5ce2af0365cc6f81f53a2604b8ed4769e13c575b44cdba823 not found: ID does not exist" Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.465683 4704 scope.go:117] "RemoveContainer" containerID="5bfb4beb15a765c76bc393184d3e121c2b3787bfa86ebd004c8584cbe647f9cd" Nov 25 15:49:09 crc kubenswrapper[4704]: E1125 15:49:09.466026 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bfb4beb15a765c76bc393184d3e121c2b3787bfa86ebd004c8584cbe647f9cd\": container with ID starting with 5bfb4beb15a765c76bc393184d3e121c2b3787bfa86ebd004c8584cbe647f9cd not found: ID does not exist" containerID="5bfb4beb15a765c76bc393184d3e121c2b3787bfa86ebd004c8584cbe647f9cd" Nov 25 15:49:09 crc kubenswrapper[4704]: I1125 15:49:09.466056 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bfb4beb15a765c76bc393184d3e121c2b3787bfa86ebd004c8584cbe647f9cd"} err="failed to get container status \"5bfb4beb15a765c76bc393184d3e121c2b3787bfa86ebd004c8584cbe647f9cd\": rpc error: code = NotFound desc = could not find container \"5bfb4beb15a765c76bc393184d3e121c2b3787bfa86ebd004c8584cbe647f9cd\": container with ID starting with 5bfb4beb15a765c76bc393184d3e121c2b3787bfa86ebd004c8584cbe647f9cd not found: ID does not 
exist" Nov 25 15:49:10 crc kubenswrapper[4704]: I1125 15:49:10.427951 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a0ed576-6fe0-4c40-8d66-344b8c18d75d" path="/var/lib/kubelet/pods/2a0ed576-6fe0-4c40-8d66-344b8c18d75d/volumes" Nov 25 15:49:11 crc kubenswrapper[4704]: I1125 15:49:11.980311 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm"] Nov 25 15:49:11 crc kubenswrapper[4704]: E1125 15:49:11.981085 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0ed576-6fe0-4c40-8d66-344b8c18d75d" containerName="extract-content" Nov 25 15:49:11 crc kubenswrapper[4704]: I1125 15:49:11.981103 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0ed576-6fe0-4c40-8d66-344b8c18d75d" containerName="extract-content" Nov 25 15:49:11 crc kubenswrapper[4704]: E1125 15:49:11.981116 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0ed576-6fe0-4c40-8d66-344b8c18d75d" containerName="extract-utilities" Nov 25 15:49:11 crc kubenswrapper[4704]: I1125 15:49:11.981122 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0ed576-6fe0-4c40-8d66-344b8c18d75d" containerName="extract-utilities" Nov 25 15:49:11 crc kubenswrapper[4704]: E1125 15:49:11.981131 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0ed576-6fe0-4c40-8d66-344b8c18d75d" containerName="registry-server" Nov 25 15:49:11 crc kubenswrapper[4704]: I1125 15:49:11.981138 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0ed576-6fe0-4c40-8d66-344b8c18d75d" containerName="registry-server" Nov 25 15:49:11 crc kubenswrapper[4704]: I1125 15:49:11.981263 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a0ed576-6fe0-4c40-8d66-344b8c18d75d" containerName="registry-server" Nov 25 15:49:11 crc kubenswrapper[4704]: I1125 15:49:11.981746 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm" Nov 25 15:49:11 crc kubenswrapper[4704]: I1125 15:49:11.985908 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 25 15:49:11 crc kubenswrapper[4704]: I1125 15:49:11.987043 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Nov 25 15:49:11 crc kubenswrapper[4704]: I1125 15:49:11.991197 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-b9jfl" Nov 25 15:49:11 crc kubenswrapper[4704]: I1125 15:49:11.996054 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm"] Nov 25 15:49:12 crc kubenswrapper[4704]: I1125 15:49:12.082769 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bd8f053-a003-406f-abd4-fabbee649785-apiservice-cert\") pod \"mariadb-operator-controller-manager-79c5bb565f-pcghm\" (UID: \"2bd8f053-a003-406f-abd4-fabbee649785\") " pod="openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm" Nov 25 15:49:12 crc kubenswrapper[4704]: I1125 15:49:12.083067 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4tnj\" (UniqueName: \"kubernetes.io/projected/2bd8f053-a003-406f-abd4-fabbee649785-kube-api-access-h4tnj\") pod \"mariadb-operator-controller-manager-79c5bb565f-pcghm\" (UID: \"2bd8f053-a003-406f-abd4-fabbee649785\") " pod="openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm" Nov 25 15:49:12 crc kubenswrapper[4704]: I1125 15:49:12.083178 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bd8f053-a003-406f-abd4-fabbee649785-webhook-cert\") pod \"mariadb-operator-controller-manager-79c5bb565f-pcghm\" (UID: \"2bd8f053-a003-406f-abd4-fabbee649785\") " pod="openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm" Nov 25 15:49:12 crc kubenswrapper[4704]: I1125 15:49:12.184905 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4tnj\" (UniqueName: \"kubernetes.io/projected/2bd8f053-a003-406f-abd4-fabbee649785-kube-api-access-h4tnj\") pod \"mariadb-operator-controller-manager-79c5bb565f-pcghm\" (UID: \"2bd8f053-a003-406f-abd4-fabbee649785\") " pod="openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm" Nov 25 15:49:12 crc kubenswrapper[4704]: I1125 15:49:12.184980 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bd8f053-a003-406f-abd4-fabbee649785-webhook-cert\") pod \"mariadb-operator-controller-manager-79c5bb565f-pcghm\" (UID: \"2bd8f053-a003-406f-abd4-fabbee649785\") " pod="openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm" Nov 25 15:49:12 crc kubenswrapper[4704]: I1125 15:49:12.185026 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bd8f053-a003-406f-abd4-fabbee649785-apiservice-cert\") pod \"mariadb-operator-controller-manager-79c5bb565f-pcghm\" (UID: \"2bd8f053-a003-406f-abd4-fabbee649785\") " pod="openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm" Nov 25 15:49:12 crc kubenswrapper[4704]: I1125 15:49:12.194047 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bd8f053-a003-406f-abd4-fabbee649785-apiservice-cert\") pod \"mariadb-operator-controller-manager-79c5bb565f-pcghm\" (UID: \"2bd8f053-a003-406f-abd4-fabbee649785\") 
" pod="openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm" Nov 25 15:49:12 crc kubenswrapper[4704]: I1125 15:49:12.209672 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4tnj\" (UniqueName: \"kubernetes.io/projected/2bd8f053-a003-406f-abd4-fabbee649785-kube-api-access-h4tnj\") pod \"mariadb-operator-controller-manager-79c5bb565f-pcghm\" (UID: \"2bd8f053-a003-406f-abd4-fabbee649785\") " pod="openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm" Nov 25 15:49:12 crc kubenswrapper[4704]: I1125 15:49:12.211023 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bd8f053-a003-406f-abd4-fabbee649785-webhook-cert\") pod \"mariadb-operator-controller-manager-79c5bb565f-pcghm\" (UID: \"2bd8f053-a003-406f-abd4-fabbee649785\") " pod="openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm" Nov 25 15:49:12 crc kubenswrapper[4704]: I1125 15:49:12.301606 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm" Nov 25 15:49:12 crc kubenswrapper[4704]: I1125 15:49:12.753118 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm"] Nov 25 15:49:13 crc kubenswrapper[4704]: I1125 15:49:13.397325 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm" event={"ID":"2bd8f053-a003-406f-abd4-fabbee649785","Type":"ContainerStarted","Data":"6a1c115583901b11f632960b3a3ef75806bc10133e7b256103c08defacc1d368"} Nov 25 15:49:15 crc kubenswrapper[4704]: I1125 15:49:15.329381 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:15 crc kubenswrapper[4704]: I1125 15:49:15.330566 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:15 crc kubenswrapper[4704]: I1125 15:49:15.382767 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:15 crc kubenswrapper[4704]: I1125 15:49:15.466520 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:17 crc kubenswrapper[4704]: I1125 15:49:17.431460 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm" event={"ID":"2bd8f053-a003-406f-abd4-fabbee649785","Type":"ContainerStarted","Data":"417a734ac19e9d03c8a75a92727091273de8bd183589c19b16eb04e213a32b51"} Nov 25 15:49:17 crc kubenswrapper[4704]: I1125 15:49:17.432013 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm" Nov 25 15:49:17 crc 
kubenswrapper[4704]: I1125 15:49:17.450513 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm" podStartSLOduration=2.576594191 podStartE2EDuration="6.450486661s" podCreationTimestamp="2025-11-25 15:49:11 +0000 UTC" firstStartedPulling="2025-11-25 15:49:12.763737491 +0000 UTC m=+839.032011272" lastFinishedPulling="2025-11-25 15:49:16.637629961 +0000 UTC m=+842.905903742" observedRunningTime="2025-11-25 15:49:17.448302308 +0000 UTC m=+843.716576109" watchObservedRunningTime="2025-11-25 15:49:17.450486661 +0000 UTC m=+843.718760442" Nov 25 15:49:18 crc kubenswrapper[4704]: I1125 15:49:18.005919 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j72nh"] Nov 25 15:49:18 crc kubenswrapper[4704]: I1125 15:49:18.007433 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:18 crc kubenswrapper[4704]: I1125 15:49:18.017815 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j72nh"] Nov 25 15:49:18 crc kubenswrapper[4704]: I1125 15:49:18.072030 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvwch\" (UniqueName: \"kubernetes.io/projected/cb56e7bd-30e9-438a-98be-563480a92fca-kube-api-access-pvwch\") pod \"redhat-operators-j72nh\" (UID: \"cb56e7bd-30e9-438a-98be-563480a92fca\") " pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:18 crc kubenswrapper[4704]: I1125 15:49:18.072112 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb56e7bd-30e9-438a-98be-563480a92fca-utilities\") pod \"redhat-operators-j72nh\" (UID: \"cb56e7bd-30e9-438a-98be-563480a92fca\") " pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:18 
crc kubenswrapper[4704]: I1125 15:49:18.072151 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb56e7bd-30e9-438a-98be-563480a92fca-catalog-content\") pod \"redhat-operators-j72nh\" (UID: \"cb56e7bd-30e9-438a-98be-563480a92fca\") " pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:18 crc kubenswrapper[4704]: I1125 15:49:18.173456 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvwch\" (UniqueName: \"kubernetes.io/projected/cb56e7bd-30e9-438a-98be-563480a92fca-kube-api-access-pvwch\") pod \"redhat-operators-j72nh\" (UID: \"cb56e7bd-30e9-438a-98be-563480a92fca\") " pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:18 crc kubenswrapper[4704]: I1125 15:49:18.173529 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb56e7bd-30e9-438a-98be-563480a92fca-utilities\") pod \"redhat-operators-j72nh\" (UID: \"cb56e7bd-30e9-438a-98be-563480a92fca\") " pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:18 crc kubenswrapper[4704]: I1125 15:49:18.173556 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb56e7bd-30e9-438a-98be-563480a92fca-catalog-content\") pod \"redhat-operators-j72nh\" (UID: \"cb56e7bd-30e9-438a-98be-563480a92fca\") " pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:18 crc kubenswrapper[4704]: I1125 15:49:18.174285 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb56e7bd-30e9-438a-98be-563480a92fca-catalog-content\") pod \"redhat-operators-j72nh\" (UID: \"cb56e7bd-30e9-438a-98be-563480a92fca\") " pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:18 crc kubenswrapper[4704]: I1125 
15:49:18.174299 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb56e7bd-30e9-438a-98be-563480a92fca-utilities\") pod \"redhat-operators-j72nh\" (UID: \"cb56e7bd-30e9-438a-98be-563480a92fca\") " pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:18 crc kubenswrapper[4704]: I1125 15:49:18.196039 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvwch\" (UniqueName: \"kubernetes.io/projected/cb56e7bd-30e9-438a-98be-563480a92fca-kube-api-access-pvwch\") pod \"redhat-operators-j72nh\" (UID: \"cb56e7bd-30e9-438a-98be-563480a92fca\") " pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:18 crc kubenswrapper[4704]: I1125 15:49:18.355883 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:18 crc kubenswrapper[4704]: I1125 15:49:18.809917 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j72nh"] Nov 25 15:49:18 crc kubenswrapper[4704]: I1125 15:49:18.998766 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z79ck"] Nov 25 15:49:18 crc kubenswrapper[4704]: I1125 15:49:18.999295 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z79ck" podUID="e1085680-6aa6-461c-9c6b-3c4b1ed0a22d" containerName="registry-server" containerID="cri-o://4882b6c4175f3a33580f2ae0f91e5cbcb5acea642bf8d5cc23c7cfe7380f4242" gracePeriod=2 Nov 25 15:49:19 crc kubenswrapper[4704]: I1125 15:49:19.445595 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j72nh" event={"ID":"cb56e7bd-30e9-438a-98be-563480a92fca","Type":"ContainerStarted","Data":"02c1835f5b5f420b93b46d99aedd462614be8b9980e8953c138fa923f34dd2a4"} Nov 25 15:49:19 crc kubenswrapper[4704]: I1125 
15:49:19.446057 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j72nh" event={"ID":"cb56e7bd-30e9-438a-98be-563480a92fca","Type":"ContainerStarted","Data":"55b07f3c8b5e0f013ed44ac294770f8d314f3798f37240dae63f2572032a2da4"} Nov 25 15:49:19 crc kubenswrapper[4704]: I1125 15:49:19.871158 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:19 crc kubenswrapper[4704]: I1125 15:49:19.995479 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df4ct\" (UniqueName: \"kubernetes.io/projected/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-kube-api-access-df4ct\") pod \"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d\" (UID: \"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d\") " Nov 25 15:49:19 crc kubenswrapper[4704]: I1125 15:49:19.995615 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-utilities\") pod \"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d\" (UID: \"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d\") " Nov 25 15:49:19 crc kubenswrapper[4704]: I1125 15:49:19.995671 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-catalog-content\") pod \"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d\" (UID: \"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d\") " Nov 25 15:49:19 crc kubenswrapper[4704]: I1125 15:49:19.999586 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-utilities" (OuterVolumeSpecName: "utilities") pod "e1085680-6aa6-461c-9c6b-3c4b1ed0a22d" (UID: "e1085680-6aa6-461c-9c6b-3c4b1ed0a22d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.004468 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-kube-api-access-df4ct" (OuterVolumeSpecName: "kube-api-access-df4ct") pod "e1085680-6aa6-461c-9c6b-3c4b1ed0a22d" (UID: "e1085680-6aa6-461c-9c6b-3c4b1ed0a22d"). InnerVolumeSpecName "kube-api-access-df4ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.046154 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1085680-6aa6-461c-9c6b-3c4b1ed0a22d" (UID: "e1085680-6aa6-461c-9c6b-3c4b1ed0a22d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.096830 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-df4ct\" (UniqueName: \"kubernetes.io/projected/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-kube-api-access-df4ct\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.096882 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.096898 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.455377 4704 generic.go:334] "Generic (PLEG): container finished" podID="e1085680-6aa6-461c-9c6b-3c4b1ed0a22d" 
containerID="4882b6c4175f3a33580f2ae0f91e5cbcb5acea642bf8d5cc23c7cfe7380f4242" exitCode=0 Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.455471 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z79ck" event={"ID":"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d","Type":"ContainerDied","Data":"4882b6c4175f3a33580f2ae0f91e5cbcb5acea642bf8d5cc23c7cfe7380f4242"} Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.455501 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z79ck" event={"ID":"e1085680-6aa6-461c-9c6b-3c4b1ed0a22d","Type":"ContainerDied","Data":"cf9baed6dcc9054ce86a985e5f47241de7120fb5d3d94cb75fc3d22e210c14ab"} Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.455539 4704 scope.go:117] "RemoveContainer" containerID="4882b6c4175f3a33580f2ae0f91e5cbcb5acea642bf8d5cc23c7cfe7380f4242" Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.455716 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z79ck" Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.458519 4704 generic.go:334] "Generic (PLEG): container finished" podID="cb56e7bd-30e9-438a-98be-563480a92fca" containerID="02c1835f5b5f420b93b46d99aedd462614be8b9980e8953c138fa923f34dd2a4" exitCode=0 Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.458543 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j72nh" event={"ID":"cb56e7bd-30e9-438a-98be-563480a92fca","Type":"ContainerDied","Data":"02c1835f5b5f420b93b46d99aedd462614be8b9980e8953c138fa923f34dd2a4"} Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.477181 4704 scope.go:117] "RemoveContainer" containerID="7a3333fee0a354b7f924dde864bddae5bb6fa32291e5e510bbdecb1ad1b6be0b" Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.501688 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z79ck"] Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.515870 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z79ck"] Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.520210 4704 scope.go:117] "RemoveContainer" containerID="da225cd4444a7a61cf51dfe780fe8a03601c0225bb59ca49d15dbbba88ed821a" Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.557071 4704 scope.go:117] "RemoveContainer" containerID="4882b6c4175f3a33580f2ae0f91e5cbcb5acea642bf8d5cc23c7cfe7380f4242" Nov 25 15:49:20 crc kubenswrapper[4704]: E1125 15:49:20.559048 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4882b6c4175f3a33580f2ae0f91e5cbcb5acea642bf8d5cc23c7cfe7380f4242\": container with ID starting with 4882b6c4175f3a33580f2ae0f91e5cbcb5acea642bf8d5cc23c7cfe7380f4242 not found: ID does not exist" 
containerID="4882b6c4175f3a33580f2ae0f91e5cbcb5acea642bf8d5cc23c7cfe7380f4242" Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.559090 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4882b6c4175f3a33580f2ae0f91e5cbcb5acea642bf8d5cc23c7cfe7380f4242"} err="failed to get container status \"4882b6c4175f3a33580f2ae0f91e5cbcb5acea642bf8d5cc23c7cfe7380f4242\": rpc error: code = NotFound desc = could not find container \"4882b6c4175f3a33580f2ae0f91e5cbcb5acea642bf8d5cc23c7cfe7380f4242\": container with ID starting with 4882b6c4175f3a33580f2ae0f91e5cbcb5acea642bf8d5cc23c7cfe7380f4242 not found: ID does not exist" Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.559116 4704 scope.go:117] "RemoveContainer" containerID="7a3333fee0a354b7f924dde864bddae5bb6fa32291e5e510bbdecb1ad1b6be0b" Nov 25 15:49:20 crc kubenswrapper[4704]: E1125 15:49:20.561399 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a3333fee0a354b7f924dde864bddae5bb6fa32291e5e510bbdecb1ad1b6be0b\": container with ID starting with 7a3333fee0a354b7f924dde864bddae5bb6fa32291e5e510bbdecb1ad1b6be0b not found: ID does not exist" containerID="7a3333fee0a354b7f924dde864bddae5bb6fa32291e5e510bbdecb1ad1b6be0b" Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.561444 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a3333fee0a354b7f924dde864bddae5bb6fa32291e5e510bbdecb1ad1b6be0b"} err="failed to get container status \"7a3333fee0a354b7f924dde864bddae5bb6fa32291e5e510bbdecb1ad1b6be0b\": rpc error: code = NotFound desc = could not find container \"7a3333fee0a354b7f924dde864bddae5bb6fa32291e5e510bbdecb1ad1b6be0b\": container with ID starting with 7a3333fee0a354b7f924dde864bddae5bb6fa32291e5e510bbdecb1ad1b6be0b not found: ID does not exist" Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.561471 4704 scope.go:117] 
"RemoveContainer" containerID="da225cd4444a7a61cf51dfe780fe8a03601c0225bb59ca49d15dbbba88ed821a" Nov 25 15:49:20 crc kubenswrapper[4704]: E1125 15:49:20.561863 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da225cd4444a7a61cf51dfe780fe8a03601c0225bb59ca49d15dbbba88ed821a\": container with ID starting with da225cd4444a7a61cf51dfe780fe8a03601c0225bb59ca49d15dbbba88ed821a not found: ID does not exist" containerID="da225cd4444a7a61cf51dfe780fe8a03601c0225bb59ca49d15dbbba88ed821a" Nov 25 15:49:20 crc kubenswrapper[4704]: I1125 15:49:20.561891 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da225cd4444a7a61cf51dfe780fe8a03601c0225bb59ca49d15dbbba88ed821a"} err="failed to get container status \"da225cd4444a7a61cf51dfe780fe8a03601c0225bb59ca49d15dbbba88ed821a\": rpc error: code = NotFound desc = could not find container \"da225cd4444a7a61cf51dfe780fe8a03601c0225bb59ca49d15dbbba88ed821a\": container with ID starting with da225cd4444a7a61cf51dfe780fe8a03601c0225bb59ca49d15dbbba88ed821a not found: ID does not exist" Nov 25 15:49:22 crc kubenswrapper[4704]: I1125 15:49:22.306014 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-79c5bb565f-pcghm" Nov 25 15:49:22 crc kubenswrapper[4704]: I1125 15:49:22.422744 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1085680-6aa6-461c-9c6b-3c4b1ed0a22d" path="/var/lib/kubelet/pods/e1085680-6aa6-461c-9c6b-3c4b1ed0a22d/volumes" Nov 25 15:49:22 crc kubenswrapper[4704]: I1125 15:49:22.470349 4704 generic.go:334] "Generic (PLEG): container finished" podID="cb56e7bd-30e9-438a-98be-563480a92fca" containerID="32bb4baf3515a021395b7cd5ea95d7208b229d24e76711fc68d56facda31009d" exitCode=0 Nov 25 15:49:22 crc kubenswrapper[4704]: I1125 15:49:22.470394 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-j72nh" event={"ID":"cb56e7bd-30e9-438a-98be-563480a92fca","Type":"ContainerDied","Data":"32bb4baf3515a021395b7cd5ea95d7208b229d24e76711fc68d56facda31009d"} Nov 25 15:49:23 crc kubenswrapper[4704]: I1125 15:49:23.478524 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j72nh" event={"ID":"cb56e7bd-30e9-438a-98be-563480a92fca","Type":"ContainerStarted","Data":"19dd8d52931dd3faa9c69dde34447ba908105144b1e91fdf7c31f3ebea6d0166"} Nov 25 15:49:23 crc kubenswrapper[4704]: I1125 15:49:23.503911 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j72nh" podStartSLOduration=4.082674899 podStartE2EDuration="6.503875056s" podCreationTimestamp="2025-11-25 15:49:17 +0000 UTC" firstStartedPulling="2025-11-25 15:49:20.460644536 +0000 UTC m=+846.728918317" lastFinishedPulling="2025-11-25 15:49:22.881844693 +0000 UTC m=+849.150118474" observedRunningTime="2025-11-25 15:49:23.502887298 +0000 UTC m=+849.771161099" watchObservedRunningTime="2025-11-25 15:49:23.503875056 +0000 UTC m=+849.772148847" Nov 25 15:49:28 crc kubenswrapper[4704]: I1125 15:49:28.356313 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:28 crc kubenswrapper[4704]: I1125 15:49:28.357412 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:28 crc kubenswrapper[4704]: I1125 15:49:28.406121 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:28 crc kubenswrapper[4704]: I1125 15:49:28.557809 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:30 crc kubenswrapper[4704]: I1125 15:49:30.801602 4704 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/infra-operator-index-sdvpn"] Nov 25 15:49:30 crc kubenswrapper[4704]: E1125 15:49:30.802359 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1085680-6aa6-461c-9c6b-3c4b1ed0a22d" containerName="extract-content" Nov 25 15:49:30 crc kubenswrapper[4704]: I1125 15:49:30.802385 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1085680-6aa6-461c-9c6b-3c4b1ed0a22d" containerName="extract-content" Nov 25 15:49:30 crc kubenswrapper[4704]: E1125 15:49:30.802412 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1085680-6aa6-461c-9c6b-3c4b1ed0a22d" containerName="extract-utilities" Nov 25 15:49:30 crc kubenswrapper[4704]: I1125 15:49:30.802422 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1085680-6aa6-461c-9c6b-3c4b1ed0a22d" containerName="extract-utilities" Nov 25 15:49:30 crc kubenswrapper[4704]: E1125 15:49:30.802432 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1085680-6aa6-461c-9c6b-3c4b1ed0a22d" containerName="registry-server" Nov 25 15:49:30 crc kubenswrapper[4704]: I1125 15:49:30.802447 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1085680-6aa6-461c-9c6b-3c4b1ed0a22d" containerName="registry-server" Nov 25 15:49:30 crc kubenswrapper[4704]: I1125 15:49:30.802829 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1085680-6aa6-461c-9c6b-3c4b1ed0a22d" containerName="registry-server" Nov 25 15:49:30 crc kubenswrapper[4704]: I1125 15:49:30.804070 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-sdvpn" Nov 25 15:49:30 crc kubenswrapper[4704]: I1125 15:49:30.811578 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-wdzs8" Nov 25 15:49:30 crc kubenswrapper[4704]: I1125 15:49:30.819905 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-sdvpn"] Nov 25 15:49:30 crc kubenswrapper[4704]: I1125 15:49:30.857573 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w5bm\" (UniqueName: \"kubernetes.io/projected/5908a0b9-0b90-4ea4-a3f3-2a67f15fd3b5-kube-api-access-4w5bm\") pod \"infra-operator-index-sdvpn\" (UID: \"5908a0b9-0b90-4ea4-a3f3-2a67f15fd3b5\") " pod="openstack-operators/infra-operator-index-sdvpn" Nov 25 15:49:30 crc kubenswrapper[4704]: I1125 15:49:30.958911 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w5bm\" (UniqueName: \"kubernetes.io/projected/5908a0b9-0b90-4ea4-a3f3-2a67f15fd3b5-kube-api-access-4w5bm\") pod \"infra-operator-index-sdvpn\" (UID: \"5908a0b9-0b90-4ea4-a3f3-2a67f15fd3b5\") " pod="openstack-operators/infra-operator-index-sdvpn" Nov 25 15:49:30 crc kubenswrapper[4704]: I1125 15:49:30.980834 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w5bm\" (UniqueName: \"kubernetes.io/projected/5908a0b9-0b90-4ea4-a3f3-2a67f15fd3b5-kube-api-access-4w5bm\") pod \"infra-operator-index-sdvpn\" (UID: \"5908a0b9-0b90-4ea4-a3f3-2a67f15fd3b5\") " pod="openstack-operators/infra-operator-index-sdvpn" Nov 25 15:49:31 crc kubenswrapper[4704]: I1125 15:49:31.125039 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-sdvpn" Nov 25 15:49:31 crc kubenswrapper[4704]: I1125 15:49:31.560537 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-sdvpn"] Nov 25 15:49:32 crc kubenswrapper[4704]: I1125 15:49:32.539709 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-sdvpn" event={"ID":"5908a0b9-0b90-4ea4-a3f3-2a67f15fd3b5","Type":"ContainerStarted","Data":"ee86944f5340b9d7cf27b2504afbe46e7e36540006e0edb7133ae6d1567fc0a6"} Nov 25 15:49:33 crc kubenswrapper[4704]: I1125 15:49:33.545751 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-sdvpn" event={"ID":"5908a0b9-0b90-4ea4-a3f3-2a67f15fd3b5","Type":"ContainerStarted","Data":"685dd04e46890ee36dd16e84eb971ec6dc29c5896d1c97f4a2d89327ce604661"} Nov 25 15:49:36 crc kubenswrapper[4704]: I1125 15:49:36.392203 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-sdvpn" podStartSLOduration=5.419320317 podStartE2EDuration="6.392183022s" podCreationTimestamp="2025-11-25 15:49:30 +0000 UTC" firstStartedPulling="2025-11-25 15:49:31.569018073 +0000 UTC m=+857.837291854" lastFinishedPulling="2025-11-25 15:49:32.541880778 +0000 UTC m=+858.810154559" observedRunningTime="2025-11-25 15:49:33.563780532 +0000 UTC m=+859.832054313" watchObservedRunningTime="2025-11-25 15:49:36.392183022 +0000 UTC m=+862.660456813" Nov 25 15:49:36 crc kubenswrapper[4704]: I1125 15:49:36.395003 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j72nh"] Nov 25 15:49:36 crc kubenswrapper[4704]: I1125 15:49:36.395241 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j72nh" podUID="cb56e7bd-30e9-438a-98be-563480a92fca" containerName="registry-server" 
containerID="cri-o://19dd8d52931dd3faa9c69dde34447ba908105144b1e91fdf7c31f3ebea6d0166" gracePeriod=2 Nov 25 15:49:36 crc kubenswrapper[4704]: I1125 15:49:36.565153 4704 generic.go:334] "Generic (PLEG): container finished" podID="cb56e7bd-30e9-438a-98be-563480a92fca" containerID="19dd8d52931dd3faa9c69dde34447ba908105144b1e91fdf7c31f3ebea6d0166" exitCode=0 Nov 25 15:49:36 crc kubenswrapper[4704]: I1125 15:49:36.565206 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j72nh" event={"ID":"cb56e7bd-30e9-438a-98be-563480a92fca","Type":"ContainerDied","Data":"19dd8d52931dd3faa9c69dde34447ba908105144b1e91fdf7c31f3ebea6d0166"} Nov 25 15:49:37 crc kubenswrapper[4704]: I1125 15:49:37.965040 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:49:37 crc kubenswrapper[4704]: I1125 15:49:37.965449 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:49:37 crc kubenswrapper[4704]: I1125 15:49:37.965504 4704 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:49:37 crc kubenswrapper[4704]: I1125 15:49:37.966176 4704 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"70c340a5598fd3ac0fcb6b9ef0ce0145e436d285d89d93a8b40ff742af895c50"} pod="openshift-machine-config-operator/machine-config-daemon-djz8x" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Nov 25 15:49:37 crc kubenswrapper[4704]: I1125 15:49:37.966232 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" containerID="cri-o://70c340a5598fd3ac0fcb6b9ef0ce0145e436d285d89d93a8b40ff742af895c50" gracePeriod=600 Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.153473 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.352766 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb56e7bd-30e9-438a-98be-563480a92fca-catalog-content\") pod \"cb56e7bd-30e9-438a-98be-563480a92fca\" (UID: \"cb56e7bd-30e9-438a-98be-563480a92fca\") " Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.352982 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb56e7bd-30e9-438a-98be-563480a92fca-utilities\") pod \"cb56e7bd-30e9-438a-98be-563480a92fca\" (UID: \"cb56e7bd-30e9-438a-98be-563480a92fca\") " Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.353065 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvwch\" (UniqueName: \"kubernetes.io/projected/cb56e7bd-30e9-438a-98be-563480a92fca-kube-api-access-pvwch\") pod \"cb56e7bd-30e9-438a-98be-563480a92fca\" (UID: \"cb56e7bd-30e9-438a-98be-563480a92fca\") " Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.353915 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb56e7bd-30e9-438a-98be-563480a92fca-utilities" (OuterVolumeSpecName: "utilities") pod 
"cb56e7bd-30e9-438a-98be-563480a92fca" (UID: "cb56e7bd-30e9-438a-98be-563480a92fca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.365695 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb56e7bd-30e9-438a-98be-563480a92fca-kube-api-access-pvwch" (OuterVolumeSpecName: "kube-api-access-pvwch") pod "cb56e7bd-30e9-438a-98be-563480a92fca" (UID: "cb56e7bd-30e9-438a-98be-563480a92fca"). InnerVolumeSpecName "kube-api-access-pvwch". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.439503 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb56e7bd-30e9-438a-98be-563480a92fca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb56e7bd-30e9-438a-98be-563480a92fca" (UID: "cb56e7bd-30e9-438a-98be-563480a92fca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.453907 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb56e7bd-30e9-438a-98be-563480a92fca-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.453941 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb56e7bd-30e9-438a-98be-563480a92fca-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.453956 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvwch\" (UniqueName: \"kubernetes.io/projected/cb56e7bd-30e9-438a-98be-563480a92fca-kube-api-access-pvwch\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.583248 4704 generic.go:334] "Generic (PLEG): container finished" podID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerID="70c340a5598fd3ac0fcb6b9ef0ce0145e436d285d89d93a8b40ff742af895c50" exitCode=0 Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.583316 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerDied","Data":"70c340a5598fd3ac0fcb6b9ef0ce0145e436d285d89d93a8b40ff742af895c50"} Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.583411 4704 scope.go:117] "RemoveContainer" containerID="491da08c31f2f2cd2745fd9b52997ec5a66034a8d558b6b85cbfececf99b972a" Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.585498 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j72nh" event={"ID":"cb56e7bd-30e9-438a-98be-563480a92fca","Type":"ContainerDied","Data":"55b07f3c8b5e0f013ed44ac294770f8d314f3798f37240dae63f2572032a2da4"} Nov 25 15:49:38 crc kubenswrapper[4704]: 
I1125 15:49:38.585551 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j72nh" Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.618047 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j72nh"] Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.621447 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j72nh"] Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.730419 4704 scope.go:117] "RemoveContainer" containerID="19dd8d52931dd3faa9c69dde34447ba908105144b1e91fdf7c31f3ebea6d0166" Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.754753 4704 scope.go:117] "RemoveContainer" containerID="32bb4baf3515a021395b7cd5ea95d7208b229d24e76711fc68d56facda31009d" Nov 25 15:49:38 crc kubenswrapper[4704]: I1125 15:49:38.775549 4704 scope.go:117] "RemoveContainer" containerID="02c1835f5b5f420b93b46d99aedd462614be8b9980e8953c138fa923f34dd2a4" Nov 25 15:49:39 crc kubenswrapper[4704]: I1125 15:49:39.594860 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerStarted","Data":"0a8966b76dc1d40a4bda67fc26f25a19803f2f36d74b3a7ae6b45d74acb00ad9"} Nov 25 15:49:40 crc kubenswrapper[4704]: I1125 15:49:40.423938 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb56e7bd-30e9-438a-98be-563480a92fca" path="/var/lib/kubelet/pods/cb56e7bd-30e9-438a-98be-563480a92fca/volumes" Nov 25 15:49:41 crc kubenswrapper[4704]: I1125 15:49:41.126194 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-sdvpn" Nov 25 15:49:41 crc kubenswrapper[4704]: I1125 15:49:41.126728 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-sdvpn" 
Nov 25 15:49:41 crc kubenswrapper[4704]: I1125 15:49:41.159434 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-sdvpn" Nov 25 15:49:41 crc kubenswrapper[4704]: I1125 15:49:41.634098 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-sdvpn" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.037814 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s"] Nov 25 15:49:43 crc kubenswrapper[4704]: E1125 15:49:43.038053 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb56e7bd-30e9-438a-98be-563480a92fca" containerName="extract-content" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.038089 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb56e7bd-30e9-438a-98be-563480a92fca" containerName="extract-content" Nov 25 15:49:43 crc kubenswrapper[4704]: E1125 15:49:43.038103 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb56e7bd-30e9-438a-98be-563480a92fca" containerName="extract-utilities" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.038109 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb56e7bd-30e9-438a-98be-563480a92fca" containerName="extract-utilities" Nov 25 15:49:43 crc kubenswrapper[4704]: E1125 15:49:43.038120 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb56e7bd-30e9-438a-98be-563480a92fca" containerName="registry-server" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.038125 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb56e7bd-30e9-438a-98be-563480a92fca" containerName="registry-server" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.038258 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb56e7bd-30e9-438a-98be-563480a92fca" containerName="registry-server" Nov 25 15:49:43 
crc kubenswrapper[4704]: I1125 15:49:43.039029 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.041276 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8zdtm" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.050673 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s"] Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.216254 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25f90ba3-d788-402c-9d77-5653b1b3bed6-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s\" (UID: \"25f90ba3-d788-402c-9d77-5653b1b3bed6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.216309 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsjgc\" (UniqueName: \"kubernetes.io/projected/25f90ba3-d788-402c-9d77-5653b1b3bed6-kube-api-access-nsjgc\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s\" (UID: \"25f90ba3-d788-402c-9d77-5653b1b3bed6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.216352 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25f90ba3-d788-402c-9d77-5653b1b3bed6-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s\" (UID: \"25f90ba3-d788-402c-9d77-5653b1b3bed6\") " 
pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.317127 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsjgc\" (UniqueName: \"kubernetes.io/projected/25f90ba3-d788-402c-9d77-5653b1b3bed6-kube-api-access-nsjgc\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s\" (UID: \"25f90ba3-d788-402c-9d77-5653b1b3bed6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.317202 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25f90ba3-d788-402c-9d77-5653b1b3bed6-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s\" (UID: \"25f90ba3-d788-402c-9d77-5653b1b3bed6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.317295 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25f90ba3-d788-402c-9d77-5653b1b3bed6-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s\" (UID: \"25f90ba3-d788-402c-9d77-5653b1b3bed6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.317816 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25f90ba3-d788-402c-9d77-5653b1b3bed6-bundle\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s\" (UID: \"25f90ba3-d788-402c-9d77-5653b1b3bed6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.317851 4704 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25f90ba3-d788-402c-9d77-5653b1b3bed6-util\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s\" (UID: \"25f90ba3-d788-402c-9d77-5653b1b3bed6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.338108 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsjgc\" (UniqueName: \"kubernetes.io/projected/25f90ba3-d788-402c-9d77-5653b1b3bed6-kube-api-access-nsjgc\") pod \"5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s\" (UID: \"25f90ba3-d788-402c-9d77-5653b1b3bed6\") " pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.358960 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" Nov 25 15:49:43 crc kubenswrapper[4704]: I1125 15:49:43.765925 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s"] Nov 25 15:49:43 crc kubenswrapper[4704]: W1125 15:49:43.775231 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25f90ba3_d788_402c_9d77_5653b1b3bed6.slice/crio-26fe242f38fbfacfdd5e91793680807dd4a1030f3e8e7cf39509d28be99253bb WatchSource:0}: Error finding container 26fe242f38fbfacfdd5e91793680807dd4a1030f3e8e7cf39509d28be99253bb: Status 404 returned error can't find the container with id 26fe242f38fbfacfdd5e91793680807dd4a1030f3e8e7cf39509d28be99253bb Nov 25 15:49:44 crc kubenswrapper[4704]: I1125 15:49:44.629035 4704 generic.go:334] "Generic (PLEG): container finished" podID="25f90ba3-d788-402c-9d77-5653b1b3bed6" 
containerID="bb7813f680d754e68ffabc3b0a1af0142f63a5c48d7a952f4fe1a52d22cb2292" exitCode=0 Nov 25 15:49:44 crc kubenswrapper[4704]: I1125 15:49:44.629105 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" event={"ID":"25f90ba3-d788-402c-9d77-5653b1b3bed6","Type":"ContainerDied","Data":"bb7813f680d754e68ffabc3b0a1af0142f63a5c48d7a952f4fe1a52d22cb2292"} Nov 25 15:49:44 crc kubenswrapper[4704]: I1125 15:49:44.629664 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" event={"ID":"25f90ba3-d788-402c-9d77-5653b1b3bed6","Type":"ContainerStarted","Data":"26fe242f38fbfacfdd5e91793680807dd4a1030f3e8e7cf39509d28be99253bb"} Nov 25 15:49:45 crc kubenswrapper[4704]: I1125 15:49:45.637280 4704 generic.go:334] "Generic (PLEG): container finished" podID="25f90ba3-d788-402c-9d77-5653b1b3bed6" containerID="4f7dacb59c8c76cebc96f3350e6f0afb21d93129f4a128ad8f5d6ed4758a5ac6" exitCode=0 Nov 25 15:49:45 crc kubenswrapper[4704]: I1125 15:49:45.637386 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" event={"ID":"25f90ba3-d788-402c-9d77-5653b1b3bed6","Type":"ContainerDied","Data":"4f7dacb59c8c76cebc96f3350e6f0afb21d93129f4a128ad8f5d6ed4758a5ac6"} Nov 25 15:49:46 crc kubenswrapper[4704]: I1125 15:49:46.644368 4704 generic.go:334] "Generic (PLEG): container finished" podID="25f90ba3-d788-402c-9d77-5653b1b3bed6" containerID="fe43fe5a780c384b617d61c7e2f2898c6b881c2e8097efb9e61b05c6e85e6ef3" exitCode=0 Nov 25 15:49:46 crc kubenswrapper[4704]: I1125 15:49:46.644415 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" 
event={"ID":"25f90ba3-d788-402c-9d77-5653b1b3bed6","Type":"ContainerDied","Data":"fe43fe5a780c384b617d61c7e2f2898c6b881c2e8097efb9e61b05c6e85e6ef3"} Nov 25 15:49:47 crc kubenswrapper[4704]: I1125 15:49:47.914649 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" Nov 25 15:49:48 crc kubenswrapper[4704]: I1125 15:49:48.081983 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25f90ba3-d788-402c-9d77-5653b1b3bed6-bundle\") pod \"25f90ba3-d788-402c-9d77-5653b1b3bed6\" (UID: \"25f90ba3-d788-402c-9d77-5653b1b3bed6\") " Nov 25 15:49:48 crc kubenswrapper[4704]: I1125 15:49:48.082533 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsjgc\" (UniqueName: \"kubernetes.io/projected/25f90ba3-d788-402c-9d77-5653b1b3bed6-kube-api-access-nsjgc\") pod \"25f90ba3-d788-402c-9d77-5653b1b3bed6\" (UID: \"25f90ba3-d788-402c-9d77-5653b1b3bed6\") " Nov 25 15:49:48 crc kubenswrapper[4704]: I1125 15:49:48.082597 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25f90ba3-d788-402c-9d77-5653b1b3bed6-util\") pod \"25f90ba3-d788-402c-9d77-5653b1b3bed6\" (UID: \"25f90ba3-d788-402c-9d77-5653b1b3bed6\") " Nov 25 15:49:48 crc kubenswrapper[4704]: I1125 15:49:48.083188 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25f90ba3-d788-402c-9d77-5653b1b3bed6-bundle" (OuterVolumeSpecName: "bundle") pod "25f90ba3-d788-402c-9d77-5653b1b3bed6" (UID: "25f90ba3-d788-402c-9d77-5653b1b3bed6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:49:48 crc kubenswrapper[4704]: I1125 15:49:48.094137 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25f90ba3-d788-402c-9d77-5653b1b3bed6-kube-api-access-nsjgc" (OuterVolumeSpecName: "kube-api-access-nsjgc") pod "25f90ba3-d788-402c-9d77-5653b1b3bed6" (UID: "25f90ba3-d788-402c-9d77-5653b1b3bed6"). InnerVolumeSpecName "kube-api-access-nsjgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:49:48 crc kubenswrapper[4704]: I1125 15:49:48.097819 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25f90ba3-d788-402c-9d77-5653b1b3bed6-util" (OuterVolumeSpecName: "util") pod "25f90ba3-d788-402c-9d77-5653b1b3bed6" (UID: "25f90ba3-d788-402c-9d77-5653b1b3bed6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:49:48 crc kubenswrapper[4704]: I1125 15:49:48.184472 4704 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25f90ba3-d788-402c-9d77-5653b1b3bed6-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:48 crc kubenswrapper[4704]: I1125 15:49:48.184532 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsjgc\" (UniqueName: \"kubernetes.io/projected/25f90ba3-d788-402c-9d77-5653b1b3bed6-kube-api-access-nsjgc\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:48 crc kubenswrapper[4704]: I1125 15:49:48.184547 4704 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25f90ba3-d788-402c-9d77-5653b1b3bed6-util\") on node \"crc\" DevicePath \"\"" Nov 25 15:49:48 crc kubenswrapper[4704]: I1125 15:49:48.659469 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" 
event={"ID":"25f90ba3-d788-402c-9d77-5653b1b3bed6","Type":"ContainerDied","Data":"26fe242f38fbfacfdd5e91793680807dd4a1030f3e8e7cf39509d28be99253bb"} Nov 25 15:49:48 crc kubenswrapper[4704]: I1125 15:49:48.659559 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26fe242f38fbfacfdd5e91793680807dd4a1030f3e8e7cf39509d28be99253bb" Nov 25 15:49:48 crc kubenswrapper[4704]: I1125 15:49:48.659567 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s" Nov 25 15:49:55 crc kubenswrapper[4704]: I1125 15:49:55.907842 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj"] Nov 25 15:49:55 crc kubenswrapper[4704]: E1125 15:49:55.909033 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f90ba3-d788-402c-9d77-5653b1b3bed6" containerName="extract" Nov 25 15:49:55 crc kubenswrapper[4704]: I1125 15:49:55.909050 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f90ba3-d788-402c-9d77-5653b1b3bed6" containerName="extract" Nov 25 15:49:55 crc kubenswrapper[4704]: E1125 15:49:55.909076 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f90ba3-d788-402c-9d77-5653b1b3bed6" containerName="util" Nov 25 15:49:55 crc kubenswrapper[4704]: I1125 15:49:55.909085 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f90ba3-d788-402c-9d77-5653b1b3bed6" containerName="util" Nov 25 15:49:55 crc kubenswrapper[4704]: E1125 15:49:55.909096 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f90ba3-d788-402c-9d77-5653b1b3bed6" containerName="pull" Nov 25 15:49:55 crc kubenswrapper[4704]: I1125 15:49:55.909103 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f90ba3-d788-402c-9d77-5653b1b3bed6" containerName="pull" Nov 25 15:49:55 crc kubenswrapper[4704]: I1125 15:49:55.909232 4704 
memory_manager.go:354] "RemoveStaleState removing state" podUID="25f90ba3-d788-402c-9d77-5653b1b3bed6" containerName="extract" Nov 25 15:49:55 crc kubenswrapper[4704]: I1125 15:49:55.909941 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" Nov 25 15:49:55 crc kubenswrapper[4704]: I1125 15:49:55.912665 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-h8wfq" Nov 25 15:49:55 crc kubenswrapper[4704]: I1125 15:49:55.916142 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Nov 25 15:49:55 crc kubenswrapper[4704]: I1125 15:49:55.930060 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj"] Nov 25 15:49:55 crc kubenswrapper[4704]: I1125 15:49:55.986236 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbc74\" (UniqueName: \"kubernetes.io/projected/8f214e62-ec20-41c1-835c-1daab12028a0-kube-api-access-qbc74\") pod \"infra-operator-controller-manager-67cd7d6948-7k7tj\" (UID: \"8f214e62-ec20-41c1-835c-1daab12028a0\") " pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" Nov 25 15:49:55 crc kubenswrapper[4704]: I1125 15:49:55.986408 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f214e62-ec20-41c1-835c-1daab12028a0-apiservice-cert\") pod \"infra-operator-controller-manager-67cd7d6948-7k7tj\" (UID: \"8f214e62-ec20-41c1-835c-1daab12028a0\") " pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" Nov 25 15:49:55 crc kubenswrapper[4704]: I1125 15:49:55.986672 4704 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f214e62-ec20-41c1-835c-1daab12028a0-webhook-cert\") pod \"infra-operator-controller-manager-67cd7d6948-7k7tj\" (UID: \"8f214e62-ec20-41c1-835c-1daab12028a0\") " pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" Nov 25 15:49:56 crc kubenswrapper[4704]: I1125 15:49:56.088107 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f214e62-ec20-41c1-835c-1daab12028a0-webhook-cert\") pod \"infra-operator-controller-manager-67cd7d6948-7k7tj\" (UID: \"8f214e62-ec20-41c1-835c-1daab12028a0\") " pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" Nov 25 15:49:56 crc kubenswrapper[4704]: I1125 15:49:56.088276 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbc74\" (UniqueName: \"kubernetes.io/projected/8f214e62-ec20-41c1-835c-1daab12028a0-kube-api-access-qbc74\") pod \"infra-operator-controller-manager-67cd7d6948-7k7tj\" (UID: \"8f214e62-ec20-41c1-835c-1daab12028a0\") " pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" Nov 25 15:49:56 crc kubenswrapper[4704]: I1125 15:49:56.088327 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f214e62-ec20-41c1-835c-1daab12028a0-apiservice-cert\") pod \"infra-operator-controller-manager-67cd7d6948-7k7tj\" (UID: \"8f214e62-ec20-41c1-835c-1daab12028a0\") " pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" Nov 25 15:49:56 crc kubenswrapper[4704]: I1125 15:49:56.095362 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f214e62-ec20-41c1-835c-1daab12028a0-apiservice-cert\") pod 
\"infra-operator-controller-manager-67cd7d6948-7k7tj\" (UID: \"8f214e62-ec20-41c1-835c-1daab12028a0\") " pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" Nov 25 15:49:56 crc kubenswrapper[4704]: I1125 15:49:56.106870 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f214e62-ec20-41c1-835c-1daab12028a0-webhook-cert\") pod \"infra-operator-controller-manager-67cd7d6948-7k7tj\" (UID: \"8f214e62-ec20-41c1-835c-1daab12028a0\") " pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" Nov 25 15:49:56 crc kubenswrapper[4704]: I1125 15:49:56.110153 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbc74\" (UniqueName: \"kubernetes.io/projected/8f214e62-ec20-41c1-835c-1daab12028a0-kube-api-access-qbc74\") pod \"infra-operator-controller-manager-67cd7d6948-7k7tj\" (UID: \"8f214e62-ec20-41c1-835c-1daab12028a0\") " pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" Nov 25 15:49:56 crc kubenswrapper[4704]: I1125 15:49:56.233146 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" Nov 25 15:49:56 crc kubenswrapper[4704]: I1125 15:49:56.480382 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj"] Nov 25 15:49:56 crc kubenswrapper[4704]: I1125 15:49:56.740588 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" event={"ID":"8f214e62-ec20-41c1-835c-1daab12028a0","Type":"ContainerStarted","Data":"18240ee50f58d9206409ef63f9e9360de12856f9b18f7007e09346bbe49a4321"} Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.445053 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.447319 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.451038 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config-data" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.451300 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"galera-openstack-dockercfg-pln7d" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.451417 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.451518 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openshift-service-ca.crt" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.451666 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"kube-root-ca.crt" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.457812 4704 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["glance-kuttl-tests/openstack-galera-2"] Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.459023 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.471707 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.482870 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.483983 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.495092 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.500945 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.516744 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65796556-fea5-482e-a4e8-883f027c30ba-operator-scripts\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.517117 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4067b873-5563-4077-b020-6464d703ddd4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.517199 4704 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.517262 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/65796556-fea5-482e-a4e8-883f027c30ba-kolla-config\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.517460 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4067b873-5563-4077-b020-6464d703ddd4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.517548 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.517658 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbrz7\" (UniqueName: \"kubernetes.io/projected/4067b873-5563-4077-b020-6464d703ddd4-kube-api-access-gbrz7\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.517721 4704 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/65796556-fea5-482e-a4e8-883f027c30ba-config-data-default\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.518053 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4067b873-5563-4077-b020-6464d703ddd4-config-data-default\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.519287 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4067b873-5563-4077-b020-6464d703ddd4-kolla-config\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.519777 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/65796556-fea5-482e-a4e8-883f027c30ba-config-data-generated\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.519939 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsjvd\" (UniqueName: \"kubernetes.io/projected/65796556-fea5-482e-a4e8-883f027c30ba-kube-api-access-gsjvd\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.621505 
4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/65796556-fea5-482e-a4e8-883f027c30ba-config-data-generated\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.621575 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/18b12599-9af6-4deb-8943-c8048a44a236-kolla-config\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.621606 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/18b12599-9af6-4deb-8943-c8048a44a236-config-data-generated\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.621634 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsjvd\" (UniqueName: \"kubernetes.io/projected/65796556-fea5-482e-a4e8-883f027c30ba-kube-api-access-gsjvd\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.621672 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4067b873-5563-4077-b020-6464d703ddd4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.621693 4704 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65796556-fea5-482e-a4e8-883f027c30ba-operator-scripts\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.621718 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.621844 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.621891 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/65796556-fea5-482e-a4e8-883f027c30ba-kolla-config\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.622013 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4067b873-5563-4077-b020-6464d703ddd4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.622043 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.622074 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/18b12599-9af6-4deb-8943-c8048a44a236-config-data-default\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.622109 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18b12599-9af6-4deb-8943-c8048a44a236-operator-scripts\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.622205 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbrz7\" (UniqueName: \"kubernetes.io/projected/4067b873-5563-4077-b020-6464d703ddd4-kube-api-access-gbrz7\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.622229 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/65796556-fea5-482e-a4e8-883f027c30ba-config-data-default\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.622247 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s98hq\" (UniqueName: 
\"kubernetes.io/projected/18b12599-9af6-4deb-8943-c8048a44a236-kube-api-access-s98hq\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.622276 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4067b873-5563-4077-b020-6464d703ddd4-config-data-default\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.622296 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4067b873-5563-4077-b020-6464d703ddd4-kolla-config\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.622336 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4067b873-5563-4077-b020-6464d703ddd4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.623180 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/65796556-fea5-482e-a4e8-883f027c30ba-kolla-config\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.623466 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-2\" (UID: 
\"65796556-fea5-482e-a4e8-883f027c30ba\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.623655 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4067b873-5563-4077-b020-6464d703ddd4-kolla-config\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.624485 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4067b873-5563-4077-b020-6464d703ddd4-config-data-default\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.624565 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4067b873-5563-4077-b020-6464d703ddd4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.624690 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.625515 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/65796556-fea5-482e-a4e8-883f027c30ba-config-data-generated\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " 
pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.627388 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/65796556-fea5-482e-a4e8-883f027c30ba-config-data-default\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.635216 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65796556-fea5-482e-a4e8-883f027c30ba-operator-scripts\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.643010 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsjvd\" (UniqueName: \"kubernetes.io/projected/65796556-fea5-482e-a4e8-883f027c30ba-kube-api-access-gsjvd\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.643375 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbrz7\" (UniqueName: \"kubernetes.io/projected/4067b873-5563-4077-b020-6464d703ddd4-kube-api-access-gbrz7\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.647195 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-2\" (UID: \"65796556-fea5-482e-a4e8-883f027c30ba\") " pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.654533 4704 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4067b873-5563-4077-b020-6464d703ddd4\") " pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.723395 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.723451 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/18b12599-9af6-4deb-8943-c8048a44a236-config-data-default\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.723471 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18b12599-9af6-4deb-8943-c8048a44a236-operator-scripts\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.723510 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s98hq\" (UniqueName: \"kubernetes.io/projected/18b12599-9af6-4deb-8943-c8048a44a236-kube-api-access-s98hq\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.723547 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/18b12599-9af6-4deb-8943-c8048a44a236-kolla-config\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.723566 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/18b12599-9af6-4deb-8943-c8048a44a236-config-data-generated\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.724106 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/18b12599-9af6-4deb-8943-c8048a44a236-config-data-generated\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.724713 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.725127 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18b12599-9af6-4deb-8943-c8048a44a236-operator-scripts\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.725609 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/18b12599-9af6-4deb-8943-c8048a44a236-config-data-default\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.725828 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/18b12599-9af6-4deb-8943-c8048a44a236-kolla-config\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.741649 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s98hq\" (UniqueName: \"kubernetes.io/projected/18b12599-9af6-4deb-8943-c8048a44a236-kube-api-access-s98hq\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.743903 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-1\" (UID: \"18b12599-9af6-4deb-8943-c8048a44a236\") " pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.785332 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.808229 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:49:58 crc kubenswrapper[4704]: I1125 15:49:58.828888 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:49:59 crc kubenswrapper[4704]: I1125 15:49:59.118651 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Nov 25 15:49:59 crc kubenswrapper[4704]: I1125 15:49:59.251995 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Nov 25 15:49:59 crc kubenswrapper[4704]: W1125 15:49:59.259466 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65796556_fea5_482e_a4e8_883f027c30ba.slice/crio-1530ba0dab978d91f95a332b8ae42a730cc437b95c1c2d80aecc9905f9a51a56 WatchSource:0}: Error finding container 1530ba0dab978d91f95a332b8ae42a730cc437b95c1c2d80aecc9905f9a51a56: Status 404 returned error can't find the container with id 1530ba0dab978d91f95a332b8ae42a730cc437b95c1c2d80aecc9905f9a51a56 Nov 25 15:49:59 crc kubenswrapper[4704]: I1125 15:49:59.351478 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Nov 25 15:49:59 crc kubenswrapper[4704]: W1125 15:49:59.356599 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18b12599_9af6_4deb_8943_c8048a44a236.slice/crio-061f9b7fac6a9c452f7c27ad9e2a1026f5e165cee61c40b835a68c061791ebd7 WatchSource:0}: Error finding container 061f9b7fac6a9c452f7c27ad9e2a1026f5e165cee61c40b835a68c061791ebd7: Status 404 returned error can't find the container with id 061f9b7fac6a9c452f7c27ad9e2a1026f5e165cee61c40b835a68c061791ebd7 Nov 25 15:49:59 crc kubenswrapper[4704]: I1125 15:49:59.767329 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" event={"ID":"8f214e62-ec20-41c1-835c-1daab12028a0","Type":"ContainerStarted","Data":"a1bdf5686b04d5cb21108fd42ff17302f557aec484f66b7bc219ae191c20e548"} Nov 25 
15:49:59 crc kubenswrapper[4704]: I1125 15:49:59.768766 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"4067b873-5563-4077-b020-6464d703ddd4","Type":"ContainerStarted","Data":"54d03549cf0a734b9d7f47cb7d7faa39434dcfa933bec207c797d0a3be7bc4d0"} Nov 25 15:49:59 crc kubenswrapper[4704]: I1125 15:49:59.770204 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"18b12599-9af6-4deb-8943-c8048a44a236","Type":"ContainerStarted","Data":"061f9b7fac6a9c452f7c27ad9e2a1026f5e165cee61c40b835a68c061791ebd7"} Nov 25 15:49:59 crc kubenswrapper[4704]: I1125 15:49:59.786768 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"65796556-fea5-482e-a4e8-883f027c30ba","Type":"ContainerStarted","Data":"1530ba0dab978d91f95a332b8ae42a730cc437b95c1c2d80aecc9905f9a51a56"} Nov 25 15:50:02 crc kubenswrapper[4704]: I1125 15:50:02.830096 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" event={"ID":"8f214e62-ec20-41c1-835c-1daab12028a0","Type":"ContainerStarted","Data":"7529957709b20ebe4899cc9085ae79891b199c761d71fdcefbcfa53bccc6cea5"} Nov 25 15:50:02 crc kubenswrapper[4704]: I1125 15:50:02.830943 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" Nov 25 15:50:02 crc kubenswrapper[4704]: I1125 15:50:02.863490 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" podStartSLOduration=2.637529082 podStartE2EDuration="7.863461745s" podCreationTimestamp="2025-11-25 15:49:55 +0000 UTC" firstStartedPulling="2025-11-25 15:49:56.490895742 +0000 UTC m=+882.759169523" lastFinishedPulling="2025-11-25 15:50:01.716828405 +0000 UTC m=+887.985102186" 
observedRunningTime="2025-11-25 15:50:02.855256858 +0000 UTC m=+889.123530659" watchObservedRunningTime="2025-11-25 15:50:02.863461745 +0000 UTC m=+889.131735546" Nov 25 15:50:03 crc kubenswrapper[4704]: I1125 15:50:03.843951 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-67cd7d6948-7k7tj" Nov 25 15:50:05 crc kubenswrapper[4704]: I1125 15:50:05.153028 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/memcached-0"] Nov 25 15:50:05 crc kubenswrapper[4704]: I1125 15:50:05.154284 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Nov 25 15:50:05 crc kubenswrapper[4704]: I1125 15:50:05.157196 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"memcached-config-data" Nov 25 15:50:05 crc kubenswrapper[4704]: I1125 15:50:05.158233 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"memcached-memcached-dockercfg-ch5n9" Nov 25 15:50:05 crc kubenswrapper[4704]: I1125 15:50:05.166694 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Nov 25 15:50:05 crc kubenswrapper[4704]: I1125 15:50:05.258673 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98qqm\" (UniqueName: \"kubernetes.io/projected/6366e7fa-3e60-4828-9571-a04c313af8df-kube-api-access-98qqm\") pod \"memcached-0\" (UID: \"6366e7fa-3e60-4828-9571-a04c313af8df\") " pod="glance-kuttl-tests/memcached-0" Nov 25 15:50:05 crc kubenswrapper[4704]: I1125 15:50:05.258756 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6366e7fa-3e60-4828-9571-a04c313af8df-config-data\") pod \"memcached-0\" (UID: \"6366e7fa-3e60-4828-9571-a04c313af8df\") " 
pod="glance-kuttl-tests/memcached-0" Nov 25 15:50:05 crc kubenswrapper[4704]: I1125 15:50:05.258841 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6366e7fa-3e60-4828-9571-a04c313af8df-kolla-config\") pod \"memcached-0\" (UID: \"6366e7fa-3e60-4828-9571-a04c313af8df\") " pod="glance-kuttl-tests/memcached-0" Nov 25 15:50:05 crc kubenswrapper[4704]: I1125 15:50:05.360406 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98qqm\" (UniqueName: \"kubernetes.io/projected/6366e7fa-3e60-4828-9571-a04c313af8df-kube-api-access-98qqm\") pod \"memcached-0\" (UID: \"6366e7fa-3e60-4828-9571-a04c313af8df\") " pod="glance-kuttl-tests/memcached-0" Nov 25 15:50:05 crc kubenswrapper[4704]: I1125 15:50:05.360956 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6366e7fa-3e60-4828-9571-a04c313af8df-config-data\") pod \"memcached-0\" (UID: \"6366e7fa-3e60-4828-9571-a04c313af8df\") " pod="glance-kuttl-tests/memcached-0" Nov 25 15:50:05 crc kubenswrapper[4704]: I1125 15:50:05.361011 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6366e7fa-3e60-4828-9571-a04c313af8df-kolla-config\") pod \"memcached-0\" (UID: \"6366e7fa-3e60-4828-9571-a04c313af8df\") " pod="glance-kuttl-tests/memcached-0" Nov 25 15:50:05 crc kubenswrapper[4704]: I1125 15:50:05.362142 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6366e7fa-3e60-4828-9571-a04c313af8df-kolla-config\") pod \"memcached-0\" (UID: \"6366e7fa-3e60-4828-9571-a04c313af8df\") " pod="glance-kuttl-tests/memcached-0" Nov 25 15:50:05 crc kubenswrapper[4704]: I1125 15:50:05.362354 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/6366e7fa-3e60-4828-9571-a04c313af8df-config-data\") pod \"memcached-0\" (UID: \"6366e7fa-3e60-4828-9571-a04c313af8df\") " pod="glance-kuttl-tests/memcached-0" Nov 25 15:50:05 crc kubenswrapper[4704]: I1125 15:50:05.380657 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98qqm\" (UniqueName: \"kubernetes.io/projected/6366e7fa-3e60-4828-9571-a04c313af8df-kube-api-access-98qqm\") pod \"memcached-0\" (UID: \"6366e7fa-3e60-4828-9571-a04c313af8df\") " pod="glance-kuttl-tests/memcached-0" Nov 25 15:50:05 crc kubenswrapper[4704]: I1125 15:50:05.473507 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Nov 25 15:50:08 crc kubenswrapper[4704]: I1125 15:50:08.006503 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-99mng"] Nov 25 15:50:08 crc kubenswrapper[4704]: I1125 15:50:08.008099 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-99mng" Nov 25 15:50:08 crc kubenswrapper[4704]: I1125 15:50:08.011927 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-s9k56" Nov 25 15:50:08 crc kubenswrapper[4704]: I1125 15:50:08.018677 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-99mng"] Nov 25 15:50:08 crc kubenswrapper[4704]: I1125 15:50:08.099396 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72gx5\" (UniqueName: \"kubernetes.io/projected/14f98924-1b85-429a-81dd-2d1b3f836464-kube-api-access-72gx5\") pod \"rabbitmq-cluster-operator-index-99mng\" (UID: \"14f98924-1b85-429a-81dd-2d1b3f836464\") " pod="openstack-operators/rabbitmq-cluster-operator-index-99mng" Nov 25 15:50:08 crc kubenswrapper[4704]: I1125 15:50:08.200781 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72gx5\" (UniqueName: \"kubernetes.io/projected/14f98924-1b85-429a-81dd-2d1b3f836464-kube-api-access-72gx5\") pod \"rabbitmq-cluster-operator-index-99mng\" (UID: \"14f98924-1b85-429a-81dd-2d1b3f836464\") " pod="openstack-operators/rabbitmq-cluster-operator-index-99mng" Nov 25 15:50:08 crc kubenswrapper[4704]: I1125 15:50:08.228036 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72gx5\" (UniqueName: \"kubernetes.io/projected/14f98924-1b85-429a-81dd-2d1b3f836464-kube-api-access-72gx5\") pod \"rabbitmq-cluster-operator-index-99mng\" (UID: \"14f98924-1b85-429a-81dd-2d1b3f836464\") " pod="openstack-operators/rabbitmq-cluster-operator-index-99mng" Nov 25 15:50:08 crc kubenswrapper[4704]: I1125 15:50:08.325500 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-99mng" Nov 25 15:50:09 crc kubenswrapper[4704]: I1125 15:50:09.733226 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-99mng"] Nov 25 15:50:09 crc kubenswrapper[4704]: W1125 15:50:09.737271 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14f98924_1b85_429a_81dd_2d1b3f836464.slice/crio-fe1e2e55142ba508f1b750c36992f036d389eae524bd82f6d42224af942c9826 WatchSource:0}: Error finding container fe1e2e55142ba508f1b750c36992f036d389eae524bd82f6d42224af942c9826: Status 404 returned error can't find the container with id fe1e2e55142ba508f1b750c36992f036d389eae524bd82f6d42224af942c9826 Nov 25 15:50:09 crc kubenswrapper[4704]: I1125 15:50:09.843067 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Nov 25 15:50:09 crc kubenswrapper[4704]: W1125 15:50:09.844777 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6366e7fa_3e60_4828_9571_a04c313af8df.slice/crio-21ccabfa26dcbb4943273575c5669ddaad353c160f01636a94e4642183e8a113 WatchSource:0}: Error finding container 21ccabfa26dcbb4943273575c5669ddaad353c160f01636a94e4642183e8a113: Status 404 returned error can't find the container with id 21ccabfa26dcbb4943273575c5669ddaad353c160f01636a94e4642183e8a113 Nov 25 15:50:09 crc kubenswrapper[4704]: I1125 15:50:09.886650 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"65796556-fea5-482e-a4e8-883f027c30ba","Type":"ContainerStarted","Data":"d0ddd3df007941e283268b270deb9a2f3e57a97e8fd75832eda3e79d59ba324c"} Nov 25 15:50:09 crc kubenswrapper[4704]: I1125 15:50:09.888877 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" 
event={"ID":"4067b873-5563-4077-b020-6464d703ddd4","Type":"ContainerStarted","Data":"824608136ef9ddcff6e9730113d9d0004f448080b20c1493aaabdb927e6828ba"} Nov 25 15:50:09 crc kubenswrapper[4704]: I1125 15:50:09.890224 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-99mng" event={"ID":"14f98924-1b85-429a-81dd-2d1b3f836464","Type":"ContainerStarted","Data":"fe1e2e55142ba508f1b750c36992f036d389eae524bd82f6d42224af942c9826"} Nov 25 15:50:09 crc kubenswrapper[4704]: I1125 15:50:09.891926 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"18b12599-9af6-4deb-8943-c8048a44a236","Type":"ContainerStarted","Data":"4d8049fcff1219855c0c7882e2687509467c6cff637d0d9d543d18d04bc7e53b"} Nov 25 15:50:09 crc kubenswrapper[4704]: I1125 15:50:09.893605 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"6366e7fa-3e60-4828-9571-a04c313af8df","Type":"ContainerStarted","Data":"21ccabfa26dcbb4943273575c5669ddaad353c160f01636a94e4642183e8a113"} Nov 25 15:50:13 crc kubenswrapper[4704]: I1125 15:50:13.920929 4704 generic.go:334] "Generic (PLEG): container finished" podID="65796556-fea5-482e-a4e8-883f027c30ba" containerID="d0ddd3df007941e283268b270deb9a2f3e57a97e8fd75832eda3e79d59ba324c" exitCode=0 Nov 25 15:50:13 crc kubenswrapper[4704]: I1125 15:50:13.921009 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"65796556-fea5-482e-a4e8-883f027c30ba","Type":"ContainerDied","Data":"d0ddd3df007941e283268b270deb9a2f3e57a97e8fd75832eda3e79d59ba324c"} Nov 25 15:50:13 crc kubenswrapper[4704]: I1125 15:50:13.927223 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-99mng" 
event={"ID":"14f98924-1b85-429a-81dd-2d1b3f836464","Type":"ContainerStarted","Data":"d8bff3789e4519f9af1ebdb4c6351f6e617167b68d46073f064eb9b1ac0bef49"} Nov 25 15:50:13 crc kubenswrapper[4704]: I1125 15:50:13.930427 4704 generic.go:334] "Generic (PLEG): container finished" podID="4067b873-5563-4077-b020-6464d703ddd4" containerID="824608136ef9ddcff6e9730113d9d0004f448080b20c1493aaabdb927e6828ba" exitCode=0 Nov 25 15:50:13 crc kubenswrapper[4704]: I1125 15:50:13.930494 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"4067b873-5563-4077-b020-6464d703ddd4","Type":"ContainerDied","Data":"824608136ef9ddcff6e9730113d9d0004f448080b20c1493aaabdb927e6828ba"} Nov 25 15:50:13 crc kubenswrapper[4704]: I1125 15:50:13.932324 4704 generic.go:334] "Generic (PLEG): container finished" podID="18b12599-9af6-4deb-8943-c8048a44a236" containerID="4d8049fcff1219855c0c7882e2687509467c6cff637d0d9d543d18d04bc7e53b" exitCode=0 Nov 25 15:50:13 crc kubenswrapper[4704]: I1125 15:50:13.932414 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"18b12599-9af6-4deb-8943-c8048a44a236","Type":"ContainerDied","Data":"4d8049fcff1219855c0c7882e2687509467c6cff637d0d9d543d18d04bc7e53b"} Nov 25 15:50:13 crc kubenswrapper[4704]: I1125 15:50:13.938425 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"6366e7fa-3e60-4828-9571-a04c313af8df","Type":"ContainerStarted","Data":"d1924f25006467c32a11d39a7400bb21defe4f5b57eb363e35a685a85a98a556"} Nov 25 15:50:13 crc kubenswrapper[4704]: I1125 15:50:13.939014 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/memcached-0" Nov 25 15:50:13 crc kubenswrapper[4704]: I1125 15:50:13.967966 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-99mng" podStartSLOduration=3.444596299 
podStartE2EDuration="6.96793795s" podCreationTimestamp="2025-11-25 15:50:07 +0000 UTC" firstStartedPulling="2025-11-25 15:50:09.74005193 +0000 UTC m=+896.008325711" lastFinishedPulling="2025-11-25 15:50:13.263393581 +0000 UTC m=+899.531667362" observedRunningTime="2025-11-25 15:50:13.965651244 +0000 UTC m=+900.233925025" watchObservedRunningTime="2025-11-25 15:50:13.96793795 +0000 UTC m=+900.236211731" Nov 25 15:50:14 crc kubenswrapper[4704]: I1125 15:50:14.948343 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"4067b873-5563-4077-b020-6464d703ddd4","Type":"ContainerStarted","Data":"0faa4c5626632f6a2fb0468198f28ed8da57cd55dd4974fbc5d48c5c72e67a9e"} Nov 25 15:50:14 crc kubenswrapper[4704]: I1125 15:50:14.951631 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"18b12599-9af6-4deb-8943-c8048a44a236","Type":"ContainerStarted","Data":"ddebf0cc207357836b14dfdb6ee58ae6cdc951a2dc8486f39e0e61e38c90fe32"} Nov 25 15:50:14 crc kubenswrapper[4704]: I1125 15:50:14.957294 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"65796556-fea5-482e-a4e8-883f027c30ba","Type":"ContainerStarted","Data":"986330f6cc850e1b5fe98631f6882647393e22c0cad112e7f2f2b480e5b324a3"} Nov 25 15:50:14 crc kubenswrapper[4704]: I1125 15:50:14.971971 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/memcached-0" podStartSLOduration=7.752183723 podStartE2EDuration="9.971934566s" podCreationTimestamp="2025-11-25 15:50:05 +0000 UTC" firstStartedPulling="2025-11-25 15:50:09.847721163 +0000 UTC m=+896.115994944" lastFinishedPulling="2025-11-25 15:50:12.067472016 +0000 UTC m=+898.335745787" observedRunningTime="2025-11-25 15:50:14.037548052 +0000 UTC m=+900.305821833" watchObservedRunningTime="2025-11-25 15:50:14.971934566 +0000 UTC m=+901.240208347" Nov 25 15:50:14 crc 
kubenswrapper[4704]: I1125 15:50:14.974029 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-0" podStartSLOduration=7.826675442 podStartE2EDuration="17.974020066s" podCreationTimestamp="2025-11-25 15:49:57 +0000 UTC" firstStartedPulling="2025-11-25 15:49:59.128862146 +0000 UTC m=+885.397135927" lastFinishedPulling="2025-11-25 15:50:09.27620677 +0000 UTC m=+895.544480551" observedRunningTime="2025-11-25 15:50:14.973985375 +0000 UTC m=+901.242259176" watchObservedRunningTime="2025-11-25 15:50:14.974020066 +0000 UTC m=+901.242293847" Nov 25 15:50:14 crc kubenswrapper[4704]: I1125 15:50:14.997686 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-2" podStartSLOduration=8.013641248 podStartE2EDuration="17.99766392s" podCreationTimestamp="2025-11-25 15:49:57 +0000 UTC" firstStartedPulling="2025-11-25 15:49:59.261996445 +0000 UTC m=+885.530270226" lastFinishedPulling="2025-11-25 15:50:09.246019117 +0000 UTC m=+895.514292898" observedRunningTime="2025-11-25 15:50:14.991922074 +0000 UTC m=+901.260195855" watchObservedRunningTime="2025-11-25 15:50:14.99766392 +0000 UTC m=+901.265937701" Nov 25 15:50:15 crc kubenswrapper[4704]: I1125 15:50:15.014656 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-1" podStartSLOduration=8.029541427 podStartE2EDuration="18.01462921s" podCreationTimestamp="2025-11-25 15:49:57 +0000 UTC" firstStartedPulling="2025-11-25 15:49:59.371436799 +0000 UTC m=+885.639710580" lastFinishedPulling="2025-11-25 15:50:09.356524592 +0000 UTC m=+895.624798363" observedRunningTime="2025-11-25 15:50:15.012467118 +0000 UTC m=+901.280740909" watchObservedRunningTime="2025-11-25 15:50:15.01462921 +0000 UTC m=+901.282902991" Nov 25 15:50:18 crc kubenswrapper[4704]: I1125 15:50:18.325923 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/rabbitmq-cluster-operator-index-99mng" Nov 25 15:50:18 crc kubenswrapper[4704]: I1125 15:50:18.326830 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-99mng" Nov 25 15:50:18 crc kubenswrapper[4704]: I1125 15:50:18.354383 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-99mng" Nov 25 15:50:18 crc kubenswrapper[4704]: I1125 15:50:18.791123 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:50:18 crc kubenswrapper[4704]: I1125 15:50:18.791205 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:50:18 crc kubenswrapper[4704]: I1125 15:50:18.808648 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:50:18 crc kubenswrapper[4704]: I1125 15:50:18.808710 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:50:18 crc kubenswrapper[4704]: I1125 15:50:18.829559 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:50:18 crc kubenswrapper[4704]: I1125 15:50:18.829642 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:50:19 crc kubenswrapper[4704]: I1125 15:50:19.006207 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-99mng" Nov 25 15:50:20 crc kubenswrapper[4704]: I1125 15:50:20.475394 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/memcached-0" Nov 25 15:50:20 crc kubenswrapper[4704]: I1125 15:50:20.979473 4704 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:50:21 crc kubenswrapper[4704]: I1125 15:50:21.044678 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-2" Nov 25 15:50:27 crc kubenswrapper[4704]: I1125 15:50:27.058727 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs"] Nov 25 15:50:27 crc kubenswrapper[4704]: I1125 15:50:27.061758 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" Nov 25 15:50:27 crc kubenswrapper[4704]: I1125 15:50:27.064668 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8zdtm" Nov 25 15:50:27 crc kubenswrapper[4704]: I1125 15:50:27.079051 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs"] Nov 25 15:50:27 crc kubenswrapper[4704]: I1125 15:50:27.085287 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs\" (UID: \"c6f65fa5-cbd1-45a8-8b39-0255370b20c4\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" Nov 25 15:50:27 crc kubenswrapper[4704]: I1125 15:50:27.085342 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs\" (UID: \"c6f65fa5-cbd1-45a8-8b39-0255370b20c4\") " 
pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" Nov 25 15:50:27 crc kubenswrapper[4704]: I1125 15:50:27.085390 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rhqm\" (UniqueName: \"kubernetes.io/projected/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-kube-api-access-7rhqm\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs\" (UID: \"c6f65fa5-cbd1-45a8-8b39-0255370b20c4\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" Nov 25 15:50:27 crc kubenswrapper[4704]: I1125 15:50:27.185980 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs\" (UID: \"c6f65fa5-cbd1-45a8-8b39-0255370b20c4\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" Nov 25 15:50:27 crc kubenswrapper[4704]: I1125 15:50:27.186033 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs\" (UID: \"c6f65fa5-cbd1-45a8-8b39-0255370b20c4\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" Nov 25 15:50:27 crc kubenswrapper[4704]: I1125 15:50:27.186079 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rhqm\" (UniqueName: \"kubernetes.io/projected/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-kube-api-access-7rhqm\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs\" (UID: \"c6f65fa5-cbd1-45a8-8b39-0255370b20c4\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" Nov 25 15:50:27 crc 
kubenswrapper[4704]: I1125 15:50:27.186719 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs\" (UID: \"c6f65fa5-cbd1-45a8-8b39-0255370b20c4\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" Nov 25 15:50:27 crc kubenswrapper[4704]: I1125 15:50:27.186852 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs\" (UID: \"c6f65fa5-cbd1-45a8-8b39-0255370b20c4\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" Nov 25 15:50:27 crc kubenswrapper[4704]: I1125 15:50:27.211811 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rhqm\" (UniqueName: \"kubernetes.io/projected/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-kube-api-access-7rhqm\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs\" (UID: \"c6f65fa5-cbd1-45a8-8b39-0255370b20c4\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" Nov 25 15:50:27 crc kubenswrapper[4704]: I1125 15:50:27.385340 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" Nov 25 15:50:27 crc kubenswrapper[4704]: I1125 15:50:27.795648 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs"] Nov 25 15:50:27 crc kubenswrapper[4704]: W1125 15:50:27.802563 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6f65fa5_cbd1_45a8_8b39_0255370b20c4.slice/crio-6b18e807e0a5cf8ddd75f26835a7f86c6b015124c849e2536df1a9d90f5379a8 WatchSource:0}: Error finding container 6b18e807e0a5cf8ddd75f26835a7f86c6b015124c849e2536df1a9d90f5379a8: Status 404 returned error can't find the container with id 6b18e807e0a5cf8ddd75f26835a7f86c6b015124c849e2536df1a9d90f5379a8 Nov 25 15:50:28 crc kubenswrapper[4704]: I1125 15:50:28.036068 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" event={"ID":"c6f65fa5-cbd1-45a8-8b39-0255370b20c4","Type":"ContainerStarted","Data":"0dcfe794382eb76c1446f21f70f462dbc61563be3242425c1bbabd0fe0892b4c"} Nov 25 15:50:28 crc kubenswrapper[4704]: I1125 15:50:28.036690 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" event={"ID":"c6f65fa5-cbd1-45a8-8b39-0255370b20c4","Type":"ContainerStarted","Data":"6b18e807e0a5cf8ddd75f26835a7f86c6b015124c849e2536df1a9d90f5379a8"} Nov 25 15:50:28 crc kubenswrapper[4704]: I1125 15:50:28.899667 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/openstack-galera-2" podUID="65796556-fea5-482e-a4e8-883f027c30ba" containerName="galera" probeResult="failure" output=< Nov 25 15:50:28 crc kubenswrapper[4704]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Nov 25 15:50:28 crc kubenswrapper[4704]: > Nov 
25 15:50:29 crc kubenswrapper[4704]: I1125 15:50:29.043749 4704 generic.go:334] "Generic (PLEG): container finished" podID="c6f65fa5-cbd1-45a8-8b39-0255370b20c4" containerID="0dcfe794382eb76c1446f21f70f462dbc61563be3242425c1bbabd0fe0892b4c" exitCode=0 Nov 25 15:50:29 crc kubenswrapper[4704]: I1125 15:50:29.043828 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" event={"ID":"c6f65fa5-cbd1-45a8-8b39-0255370b20c4","Type":"ContainerDied","Data":"0dcfe794382eb76c1446f21f70f462dbc61563be3242425c1bbabd0fe0892b4c"} Nov 25 15:50:31 crc kubenswrapper[4704]: I1125 15:50:31.063980 4704 generic.go:334] "Generic (PLEG): container finished" podID="c6f65fa5-cbd1-45a8-8b39-0255370b20c4" containerID="12bdf3487f0dce1bf452bc468b6e1ab44072a8de2a03e82116688cca7cee45b3" exitCode=0 Nov 25 15:50:31 crc kubenswrapper[4704]: I1125 15:50:31.064091 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" event={"ID":"c6f65fa5-cbd1-45a8-8b39-0255370b20c4","Type":"ContainerDied","Data":"12bdf3487f0dce1bf452bc468b6e1ab44072a8de2a03e82116688cca7cee45b3"} Nov 25 15:50:32 crc kubenswrapper[4704]: I1125 15:50:32.074332 4704 generic.go:334] "Generic (PLEG): container finished" podID="c6f65fa5-cbd1-45a8-8b39-0255370b20c4" containerID="7bd5d37d7dfa03e86ed6dd39687bda3ea3c47373f6f1c172b0dc503399460504" exitCode=0 Nov 25 15:50:32 crc kubenswrapper[4704]: I1125 15:50:32.074421 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" event={"ID":"c6f65fa5-cbd1-45a8-8b39-0255370b20c4","Type":"ContainerDied","Data":"7bd5d37d7dfa03e86ed6dd39687bda3ea3c47373f6f1c172b0dc503399460504"} Nov 25 15:50:33 crc kubenswrapper[4704]: I1125 15:50:33.346187 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" Nov 25 15:50:33 crc kubenswrapper[4704]: I1125 15:50:33.481046 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-util\") pod \"c6f65fa5-cbd1-45a8-8b39-0255370b20c4\" (UID: \"c6f65fa5-cbd1-45a8-8b39-0255370b20c4\") " Nov 25 15:50:33 crc kubenswrapper[4704]: I1125 15:50:33.481937 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rhqm\" (UniqueName: \"kubernetes.io/projected/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-kube-api-access-7rhqm\") pod \"c6f65fa5-cbd1-45a8-8b39-0255370b20c4\" (UID: \"c6f65fa5-cbd1-45a8-8b39-0255370b20c4\") " Nov 25 15:50:33 crc kubenswrapper[4704]: I1125 15:50:33.482000 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-bundle\") pod \"c6f65fa5-cbd1-45a8-8b39-0255370b20c4\" (UID: \"c6f65fa5-cbd1-45a8-8b39-0255370b20c4\") " Nov 25 15:50:33 crc kubenswrapper[4704]: I1125 15:50:33.484977 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-bundle" (OuterVolumeSpecName: "bundle") pod "c6f65fa5-cbd1-45a8-8b39-0255370b20c4" (UID: "c6f65fa5-cbd1-45a8-8b39-0255370b20c4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:50:33 crc kubenswrapper[4704]: I1125 15:50:33.489906 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-kube-api-access-7rhqm" (OuterVolumeSpecName: "kube-api-access-7rhqm") pod "c6f65fa5-cbd1-45a8-8b39-0255370b20c4" (UID: "c6f65fa5-cbd1-45a8-8b39-0255370b20c4"). InnerVolumeSpecName "kube-api-access-7rhqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:50:33 crc kubenswrapper[4704]: I1125 15:50:33.493738 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-util" (OuterVolumeSpecName: "util") pod "c6f65fa5-cbd1-45a8-8b39-0255370b20c4" (UID: "c6f65fa5-cbd1-45a8-8b39-0255370b20c4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:50:33 crc kubenswrapper[4704]: I1125 15:50:33.583348 4704 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-util\") on node \"crc\" DevicePath \"\"" Nov 25 15:50:33 crc kubenswrapper[4704]: I1125 15:50:33.583394 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rhqm\" (UniqueName: \"kubernetes.io/projected/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-kube-api-access-7rhqm\") on node \"crc\" DevicePath \"\"" Nov 25 15:50:33 crc kubenswrapper[4704]: I1125 15:50:33.583405 4704 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6f65fa5-cbd1-45a8-8b39-0255370b20c4-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:50:34 crc kubenswrapper[4704]: I1125 15:50:34.089052 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" event={"ID":"c6f65fa5-cbd1-45a8-8b39-0255370b20c4","Type":"ContainerDied","Data":"6b18e807e0a5cf8ddd75f26835a7f86c6b015124c849e2536df1a9d90f5379a8"} Nov 25 15:50:34 crc kubenswrapper[4704]: I1125 15:50:34.089347 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b18e807e0a5cf8ddd75f26835a7f86c6b015124c849e2536df1a9d90f5379a8" Nov 25 15:50:34 crc kubenswrapper[4704]: I1125 15:50:34.089153 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs" Nov 25 15:50:34 crc kubenswrapper[4704]: I1125 15:50:34.866611 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:50:34 crc kubenswrapper[4704]: I1125 15:50:34.937682 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-0" Nov 25 15:50:35 crc kubenswrapper[4704]: I1125 15:50:35.792435 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:50:35 crc kubenswrapper[4704]: I1125 15:50:35.852643 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-1" Nov 25 15:50:41 crc kubenswrapper[4704]: I1125 15:50:41.264341 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-848dk"] Nov 25 15:50:41 crc kubenswrapper[4704]: E1125 15:50:41.265208 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f65fa5-cbd1-45a8-8b39-0255370b20c4" containerName="extract" Nov 25 15:50:41 crc kubenswrapper[4704]: I1125 15:50:41.265225 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f65fa5-cbd1-45a8-8b39-0255370b20c4" containerName="extract" Nov 25 15:50:41 crc kubenswrapper[4704]: E1125 15:50:41.265247 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f65fa5-cbd1-45a8-8b39-0255370b20c4" containerName="pull" Nov 25 15:50:41 crc kubenswrapper[4704]: I1125 15:50:41.265254 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f65fa5-cbd1-45a8-8b39-0255370b20c4" containerName="pull" Nov 25 15:50:41 crc kubenswrapper[4704]: E1125 15:50:41.265267 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f65fa5-cbd1-45a8-8b39-0255370b20c4" containerName="util" Nov 25 15:50:41 crc 
kubenswrapper[4704]: I1125 15:50:41.265273 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f65fa5-cbd1-45a8-8b39-0255370b20c4" containerName="util" Nov 25 15:50:41 crc kubenswrapper[4704]: I1125 15:50:41.265403 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6f65fa5-cbd1-45a8-8b39-0255370b20c4" containerName="extract" Nov 25 15:50:41 crc kubenswrapper[4704]: I1125 15:50:41.265999 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-848dk" Nov 25 15:50:41 crc kubenswrapper[4704]: I1125 15:50:41.268362 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-kxxr5" Nov 25 15:50:41 crc kubenswrapper[4704]: I1125 15:50:41.280061 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-848dk"] Nov 25 15:50:41 crc kubenswrapper[4704]: I1125 15:50:41.325385 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz88w\" (UniqueName: \"kubernetes.io/projected/e8a33191-6af5-44c1-8f3f-74c8e186a7e3-kube-api-access-hz88w\") pod \"rabbitmq-cluster-operator-779fc9694b-848dk\" (UID: \"e8a33191-6af5-44c1-8f3f-74c8e186a7e3\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-848dk" Nov 25 15:50:41 crc kubenswrapper[4704]: I1125 15:50:41.426469 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz88w\" (UniqueName: \"kubernetes.io/projected/e8a33191-6af5-44c1-8f3f-74c8e186a7e3-kube-api-access-hz88w\") pod \"rabbitmq-cluster-operator-779fc9694b-848dk\" (UID: \"e8a33191-6af5-44c1-8f3f-74c8e186a7e3\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-848dk" Nov 25 15:50:41 crc kubenswrapper[4704]: I1125 15:50:41.445799 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hz88w\" (UniqueName: \"kubernetes.io/projected/e8a33191-6af5-44c1-8f3f-74c8e186a7e3-kube-api-access-hz88w\") pod \"rabbitmq-cluster-operator-779fc9694b-848dk\" (UID: \"e8a33191-6af5-44c1-8f3f-74c8e186a7e3\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-848dk" Nov 25 15:50:41 crc kubenswrapper[4704]: I1125 15:50:41.632194 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-848dk" Nov 25 15:50:42 crc kubenswrapper[4704]: I1125 15:50:42.115338 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-848dk"] Nov 25 15:50:42 crc kubenswrapper[4704]: I1125 15:50:42.137724 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-848dk" event={"ID":"e8a33191-6af5-44c1-8f3f-74c8e186a7e3","Type":"ContainerStarted","Data":"9d3f51486219ca0ade4270431c211d32e678273b0591cd08e4e4f0e9584d9c6f"} Nov 25 15:50:46 crc kubenswrapper[4704]: I1125 15:50:46.163297 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-848dk" event={"ID":"e8a33191-6af5-44c1-8f3f-74c8e186a7e3","Type":"ContainerStarted","Data":"58411a04385307a562833bc139988208a350bdf88dd75a01a3f171a04b2ca489"} Nov 25 15:50:46 crc kubenswrapper[4704]: I1125 15:50:46.180384 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-848dk" podStartSLOduration=1.480678154 podStartE2EDuration="5.180362763s" podCreationTimestamp="2025-11-25 15:50:41 +0000 UTC" firstStartedPulling="2025-11-25 15:50:42.128126922 +0000 UTC m=+928.396400703" lastFinishedPulling="2025-11-25 15:50:45.827811531 +0000 UTC m=+932.096085312" observedRunningTime="2025-11-25 15:50:46.176680227 +0000 UTC m=+932.444954018" watchObservedRunningTime="2025-11-25 15:50:46.180362763 +0000 UTC 
m=+932.448636544" Nov 25 15:50:47 crc kubenswrapper[4704]: I1125 15:50:47.924184 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Nov 25 15:50:47 crc kubenswrapper[4704]: I1125 15:50:47.925214 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:47 crc kubenswrapper[4704]: I1125 15:50:47.927097 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-server-dockercfg-swqdn" Nov 25 15:50:47 crc kubenswrapper[4704]: I1125 15:50:47.927657 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-server-conf" Nov 25 15:50:47 crc kubenswrapper[4704]: I1125 15:50:47.928379 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-plugins-conf" Nov 25 15:50:47 crc kubenswrapper[4704]: I1125 15:50:47.928481 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-erlang-cookie" Nov 25 15:50:47 crc kubenswrapper[4704]: I1125 15:50:47.931117 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-default-user" Nov 25 15:50:47 crc kubenswrapper[4704]: I1125 15:50:47.939547 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.110565 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.110626 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.110711 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.110753 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwv4t\" (UniqueName: \"kubernetes.io/projected/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-kube-api-access-mwv4t\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.110847 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.110903 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.110931 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.110985 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-908fed90-d3a5-4b51-97f8-761cb5b69eb6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-908fed90-d3a5-4b51-97f8-761cb5b69eb6\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.212743 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.212855 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.212888 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.212926 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwv4t\" (UniqueName: 
\"kubernetes.io/projected/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-kube-api-access-mwv4t\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.212991 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.213065 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.213091 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.213129 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-908fed90-d3a5-4b51-97f8-761cb5b69eb6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-908fed90-d3a5-4b51-97f8-761cb5b69eb6\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.215836 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-plugins-conf\") 
pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.216912 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.216936 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.218471 4704 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.218504 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-908fed90-d3a5-4b51-97f8-761cb5b69eb6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-908fed90-d3a5-4b51-97f8-761cb5b69eb6\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/407a3b8d1e8b1285b175b0c4a201909fad36e967d6bf44e7e60ad1a06837da92/globalmount\"" pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.222294 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.225700 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.231109 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.236185 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwv4t\" (UniqueName: \"kubernetes.io/projected/bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a-kube-api-access-mwv4t\") pod \"rabbitmq-server-0\" (UID: 
\"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.246386 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-908fed90-d3a5-4b51-97f8-761cb5b69eb6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-908fed90-d3a5-4b51-97f8-761cb5b69eb6\") pod \"rabbitmq-server-0\" (UID: \"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a\") " pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.545475 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:50:48 crc kubenswrapper[4704]: I1125 15:50:48.878410 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Nov 25 15:50:49 crc kubenswrapper[4704]: I1125 15:50:49.182860 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a","Type":"ContainerStarted","Data":"3355d5e65dc0db59843b823a0287085ca6c5f898e9686e1c1313fbca1909f522"} Nov 25 15:50:57 crc kubenswrapper[4704]: I1125 15:50:57.245007 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a","Type":"ContainerStarted","Data":"7014b49fedf2a160b4c1ebcaa23210df82f1dab31bf65138faee01c49e17aa83"} Nov 25 15:50:58 crc kubenswrapper[4704]: I1125 15:50:58.692529 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-mdz8p"] Nov 25 15:50:58 crc kubenswrapper[4704]: I1125 15:50:58.693348 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-mdz8p" Nov 25 15:50:58 crc kubenswrapper[4704]: I1125 15:50:58.695157 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-tqscs" Nov 25 15:50:58 crc kubenswrapper[4704]: I1125 15:50:58.700192 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-mdz8p"] Nov 25 15:50:58 crc kubenswrapper[4704]: I1125 15:50:58.859315 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bddh\" (UniqueName: \"kubernetes.io/projected/17f6718f-4687-4f52-827a-479e1af368ed-kube-api-access-2bddh\") pod \"keystone-operator-index-mdz8p\" (UID: \"17f6718f-4687-4f52-827a-479e1af368ed\") " pod="openstack-operators/keystone-operator-index-mdz8p" Nov 25 15:50:58 crc kubenswrapper[4704]: I1125 15:50:58.961454 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bddh\" (UniqueName: \"kubernetes.io/projected/17f6718f-4687-4f52-827a-479e1af368ed-kube-api-access-2bddh\") pod \"keystone-operator-index-mdz8p\" (UID: \"17f6718f-4687-4f52-827a-479e1af368ed\") " pod="openstack-operators/keystone-operator-index-mdz8p" Nov 25 15:50:58 crc kubenswrapper[4704]: I1125 15:50:58.982394 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bddh\" (UniqueName: \"kubernetes.io/projected/17f6718f-4687-4f52-827a-479e1af368ed-kube-api-access-2bddh\") pod \"keystone-operator-index-mdz8p\" (UID: \"17f6718f-4687-4f52-827a-479e1af368ed\") " pod="openstack-operators/keystone-operator-index-mdz8p" Nov 25 15:50:59 crc kubenswrapper[4704]: I1125 15:50:59.018152 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-mdz8p" Nov 25 15:50:59 crc kubenswrapper[4704]: I1125 15:50:59.442510 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-mdz8p"] Nov 25 15:51:00 crc kubenswrapper[4704]: I1125 15:51:00.270109 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-mdz8p" event={"ID":"17f6718f-4687-4f52-827a-479e1af368ed","Type":"ContainerStarted","Data":"2234fdb130416e026a57410a06d62c09b3e9720ced90c1973a7a5495acd6ea3d"} Nov 25 15:51:01 crc kubenswrapper[4704]: I1125 15:51:01.278747 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-mdz8p" event={"ID":"17f6718f-4687-4f52-827a-479e1af368ed","Type":"ContainerStarted","Data":"bdb504794317230c583f615cb20a3c941cc6f7f44c959432b9585cb492eb777a"} Nov 25 15:51:01 crc kubenswrapper[4704]: I1125 15:51:01.298313 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-mdz8p" podStartSLOduration=2.526130607 podStartE2EDuration="3.29829494s" podCreationTimestamp="2025-11-25 15:50:58 +0000 UTC" firstStartedPulling="2025-11-25 15:50:59.452725374 +0000 UTC m=+945.720999145" lastFinishedPulling="2025-11-25 15:51:00.224889697 +0000 UTC m=+946.493163478" observedRunningTime="2025-11-25 15:51:01.298075894 +0000 UTC m=+947.566349685" watchObservedRunningTime="2025-11-25 15:51:01.29829494 +0000 UTC m=+947.566568721" Nov 25 15:51:09 crc kubenswrapper[4704]: I1125 15:51:09.018626 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-mdz8p" Nov 25 15:51:09 crc kubenswrapper[4704]: I1125 15:51:09.019504 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-mdz8p" Nov 25 15:51:09 crc kubenswrapper[4704]: I1125 15:51:09.044466 4704 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-mdz8p" Nov 25 15:51:09 crc kubenswrapper[4704]: I1125 15:51:09.371862 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-mdz8p" Nov 25 15:51:13 crc kubenswrapper[4704]: I1125 15:51:13.438214 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt"] Nov 25 15:51:13 crc kubenswrapper[4704]: I1125 15:51:13.439863 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" Nov 25 15:51:13 crc kubenswrapper[4704]: I1125 15:51:13.446699 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt"] Nov 25 15:51:13 crc kubenswrapper[4704]: I1125 15:51:13.446953 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8zdtm" Nov 25 15:51:13 crc kubenswrapper[4704]: I1125 15:51:13.555770 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f032655-f673-4d19-a90f-67d2e4cbc198-bundle\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt\" (UID: \"1f032655-f673-4d19-a90f-67d2e4cbc198\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" Nov 25 15:51:13 crc kubenswrapper[4704]: I1125 15:51:13.555858 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f032655-f673-4d19-a90f-67d2e4cbc198-util\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt\" (UID: \"1f032655-f673-4d19-a90f-67d2e4cbc198\") " 
pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" Nov 25 15:51:13 crc kubenswrapper[4704]: I1125 15:51:13.555886 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h22r9\" (UniqueName: \"kubernetes.io/projected/1f032655-f673-4d19-a90f-67d2e4cbc198-kube-api-access-h22r9\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt\" (UID: \"1f032655-f673-4d19-a90f-67d2e4cbc198\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" Nov 25 15:51:13 crc kubenswrapper[4704]: I1125 15:51:13.657293 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f032655-f673-4d19-a90f-67d2e4cbc198-bundle\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt\" (UID: \"1f032655-f673-4d19-a90f-67d2e4cbc198\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" Nov 25 15:51:13 crc kubenswrapper[4704]: I1125 15:51:13.657696 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f032655-f673-4d19-a90f-67d2e4cbc198-util\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt\" (UID: \"1f032655-f673-4d19-a90f-67d2e4cbc198\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" Nov 25 15:51:13 crc kubenswrapper[4704]: I1125 15:51:13.657878 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h22r9\" (UniqueName: \"kubernetes.io/projected/1f032655-f673-4d19-a90f-67d2e4cbc198-kube-api-access-h22r9\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt\" (UID: \"1f032655-f673-4d19-a90f-67d2e4cbc198\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" Nov 25 15:51:13 crc 
kubenswrapper[4704]: I1125 15:51:13.658019 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f032655-f673-4d19-a90f-67d2e4cbc198-bundle\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt\" (UID: \"1f032655-f673-4d19-a90f-67d2e4cbc198\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" Nov 25 15:51:13 crc kubenswrapper[4704]: I1125 15:51:13.658319 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f032655-f673-4d19-a90f-67d2e4cbc198-util\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt\" (UID: \"1f032655-f673-4d19-a90f-67d2e4cbc198\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" Nov 25 15:51:13 crc kubenswrapper[4704]: I1125 15:51:13.678754 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h22r9\" (UniqueName: \"kubernetes.io/projected/1f032655-f673-4d19-a90f-67d2e4cbc198-kube-api-access-h22r9\") pod \"d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt\" (UID: \"1f032655-f673-4d19-a90f-67d2e4cbc198\") " pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" Nov 25 15:51:13 crc kubenswrapper[4704]: I1125 15:51:13.758267 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" Nov 25 15:51:13 crc kubenswrapper[4704]: I1125 15:51:13.970859 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt"] Nov 25 15:51:13 crc kubenswrapper[4704]: W1125 15:51:13.976369 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f032655_f673_4d19_a90f_67d2e4cbc198.slice/crio-b42b19842f3307800f73f5bbef0ef1369c9cc6f1a2769b4d351d3021c6c39f2a WatchSource:0}: Error finding container b42b19842f3307800f73f5bbef0ef1369c9cc6f1a2769b4d351d3021c6c39f2a: Status 404 returned error can't find the container with id b42b19842f3307800f73f5bbef0ef1369c9cc6f1a2769b4d351d3021c6c39f2a Nov 25 15:51:14 crc kubenswrapper[4704]: I1125 15:51:14.373705 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" event={"ID":"1f032655-f673-4d19-a90f-67d2e4cbc198","Type":"ContainerStarted","Data":"b42b19842f3307800f73f5bbef0ef1369c9cc6f1a2769b4d351d3021c6c39f2a"} Nov 25 15:51:15 crc kubenswrapper[4704]: I1125 15:51:15.381001 4704 generic.go:334] "Generic (PLEG): container finished" podID="1f032655-f673-4d19-a90f-67d2e4cbc198" containerID="03581bcc455d81e7fe9442277112905c7e3a3c69fcd61831ae26b56c99e03110" exitCode=0 Nov 25 15:51:15 crc kubenswrapper[4704]: I1125 15:51:15.381103 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" event={"ID":"1f032655-f673-4d19-a90f-67d2e4cbc198","Type":"ContainerDied","Data":"03581bcc455d81e7fe9442277112905c7e3a3c69fcd61831ae26b56c99e03110"} Nov 25 15:51:16 crc kubenswrapper[4704]: I1125 15:51:16.397907 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" event={"ID":"1f032655-f673-4d19-a90f-67d2e4cbc198","Type":"ContainerStarted","Data":"abcd12ede4ff90d798a02c789ac177875c0f053989cbfcb996eb18571ac97545"} Nov 25 15:51:17 crc kubenswrapper[4704]: I1125 15:51:17.407724 4704 generic.go:334] "Generic (PLEG): container finished" podID="1f032655-f673-4d19-a90f-67d2e4cbc198" containerID="abcd12ede4ff90d798a02c789ac177875c0f053989cbfcb996eb18571ac97545" exitCode=0 Nov 25 15:51:17 crc kubenswrapper[4704]: I1125 15:51:17.407773 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" event={"ID":"1f032655-f673-4d19-a90f-67d2e4cbc198","Type":"ContainerDied","Data":"abcd12ede4ff90d798a02c789ac177875c0f053989cbfcb996eb18571ac97545"} Nov 25 15:51:18 crc kubenswrapper[4704]: I1125 15:51:18.417670 4704 generic.go:334] "Generic (PLEG): container finished" podID="1f032655-f673-4d19-a90f-67d2e4cbc198" containerID="ba91e36d5df1bbb647aa2015f2c7bd89b929311e5859590e2737010f7e7c257f" exitCode=0 Nov 25 15:51:18 crc kubenswrapper[4704]: I1125 15:51:18.422939 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" event={"ID":"1f032655-f673-4d19-a90f-67d2e4cbc198","Type":"ContainerDied","Data":"ba91e36d5df1bbb647aa2015f2c7bd89b929311e5859590e2737010f7e7c257f"} Nov 25 15:51:19 crc kubenswrapper[4704]: I1125 15:51:19.663175 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" Nov 25 15:51:19 crc kubenswrapper[4704]: I1125 15:51:19.784879 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f032655-f673-4d19-a90f-67d2e4cbc198-bundle\") pod \"1f032655-f673-4d19-a90f-67d2e4cbc198\" (UID: \"1f032655-f673-4d19-a90f-67d2e4cbc198\") " Nov 25 15:51:19 crc kubenswrapper[4704]: I1125 15:51:19.784970 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f032655-f673-4d19-a90f-67d2e4cbc198-util\") pod \"1f032655-f673-4d19-a90f-67d2e4cbc198\" (UID: \"1f032655-f673-4d19-a90f-67d2e4cbc198\") " Nov 25 15:51:19 crc kubenswrapper[4704]: I1125 15:51:19.785027 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h22r9\" (UniqueName: \"kubernetes.io/projected/1f032655-f673-4d19-a90f-67d2e4cbc198-kube-api-access-h22r9\") pod \"1f032655-f673-4d19-a90f-67d2e4cbc198\" (UID: \"1f032655-f673-4d19-a90f-67d2e4cbc198\") " Nov 25 15:51:19 crc kubenswrapper[4704]: I1125 15:51:19.786736 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f032655-f673-4d19-a90f-67d2e4cbc198-bundle" (OuterVolumeSpecName: "bundle") pod "1f032655-f673-4d19-a90f-67d2e4cbc198" (UID: "1f032655-f673-4d19-a90f-67d2e4cbc198"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:51:19 crc kubenswrapper[4704]: I1125 15:51:19.791863 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f032655-f673-4d19-a90f-67d2e4cbc198-kube-api-access-h22r9" (OuterVolumeSpecName: "kube-api-access-h22r9") pod "1f032655-f673-4d19-a90f-67d2e4cbc198" (UID: "1f032655-f673-4d19-a90f-67d2e4cbc198"). InnerVolumeSpecName "kube-api-access-h22r9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:51:19 crc kubenswrapper[4704]: I1125 15:51:19.800938 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f032655-f673-4d19-a90f-67d2e4cbc198-util" (OuterVolumeSpecName: "util") pod "1f032655-f673-4d19-a90f-67d2e4cbc198" (UID: "1f032655-f673-4d19-a90f-67d2e4cbc198"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:51:19 crc kubenswrapper[4704]: I1125 15:51:19.886975 4704 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f032655-f673-4d19-a90f-67d2e4cbc198-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:51:19 crc kubenswrapper[4704]: I1125 15:51:19.887061 4704 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f032655-f673-4d19-a90f-67d2e4cbc198-util\") on node \"crc\" DevicePath \"\"" Nov 25 15:51:19 crc kubenswrapper[4704]: I1125 15:51:19.887080 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h22r9\" (UniqueName: \"kubernetes.io/projected/1f032655-f673-4d19-a90f-67d2e4cbc198-kube-api-access-h22r9\") on node \"crc\" DevicePath \"\"" Nov 25 15:51:20 crc kubenswrapper[4704]: I1125 15:51:20.436885 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" event={"ID":"1f032655-f673-4d19-a90f-67d2e4cbc198","Type":"ContainerDied","Data":"b42b19842f3307800f73f5bbef0ef1369c9cc6f1a2769b4d351d3021c6c39f2a"} Nov 25 15:51:20 crc kubenswrapper[4704]: I1125 15:51:20.436927 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b42b19842f3307800f73f5bbef0ef1369c9cc6f1a2769b4d351d3021c6c39f2a" Nov 25 15:51:20 crc kubenswrapper[4704]: I1125 15:51:20.437003 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.310240 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f"] Nov 25 15:51:27 crc kubenswrapper[4704]: E1125 15:51:27.311549 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f032655-f673-4d19-a90f-67d2e4cbc198" containerName="pull" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.311572 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f032655-f673-4d19-a90f-67d2e4cbc198" containerName="pull" Nov 25 15:51:27 crc kubenswrapper[4704]: E1125 15:51:27.311600 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f032655-f673-4d19-a90f-67d2e4cbc198" containerName="util" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.311608 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f032655-f673-4d19-a90f-67d2e4cbc198" containerName="util" Nov 25 15:51:27 crc kubenswrapper[4704]: E1125 15:51:27.311623 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f032655-f673-4d19-a90f-67d2e4cbc198" containerName="extract" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.311632 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f032655-f673-4d19-a90f-67d2e4cbc198" containerName="extract" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.311769 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f032655-f673-4d19-a90f-67d2e4cbc198" containerName="extract" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.312396 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.316023 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-4kp4d" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.316296 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.326884 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f"] Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.387015 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/440b2c47-17eb-4f7f-893e-7ccc849d2557-apiservice-cert\") pod \"keystone-operator-controller-manager-5684f64755-gm29f\" (UID: \"440b2c47-17eb-4f7f-893e-7ccc849d2557\") " pod="openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.387190 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/440b2c47-17eb-4f7f-893e-7ccc849d2557-webhook-cert\") pod \"keystone-operator-controller-manager-5684f64755-gm29f\" (UID: \"440b2c47-17eb-4f7f-893e-7ccc849d2557\") " pod="openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.387227 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmfwp\" (UniqueName: \"kubernetes.io/projected/440b2c47-17eb-4f7f-893e-7ccc849d2557-kube-api-access-hmfwp\") pod \"keystone-operator-controller-manager-5684f64755-gm29f\" 
(UID: \"440b2c47-17eb-4f7f-893e-7ccc849d2557\") " pod="openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.488571 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/440b2c47-17eb-4f7f-893e-7ccc849d2557-apiservice-cert\") pod \"keystone-operator-controller-manager-5684f64755-gm29f\" (UID: \"440b2c47-17eb-4f7f-893e-7ccc849d2557\") " pod="openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.488688 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/440b2c47-17eb-4f7f-893e-7ccc849d2557-webhook-cert\") pod \"keystone-operator-controller-manager-5684f64755-gm29f\" (UID: \"440b2c47-17eb-4f7f-893e-7ccc849d2557\") " pod="openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.488724 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmfwp\" (UniqueName: \"kubernetes.io/projected/440b2c47-17eb-4f7f-893e-7ccc849d2557-kube-api-access-hmfwp\") pod \"keystone-operator-controller-manager-5684f64755-gm29f\" (UID: \"440b2c47-17eb-4f7f-893e-7ccc849d2557\") " pod="openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.498556 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/440b2c47-17eb-4f7f-893e-7ccc849d2557-apiservice-cert\") pod \"keystone-operator-controller-manager-5684f64755-gm29f\" (UID: \"440b2c47-17eb-4f7f-893e-7ccc849d2557\") " pod="openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.498538 4704 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/440b2c47-17eb-4f7f-893e-7ccc849d2557-webhook-cert\") pod \"keystone-operator-controller-manager-5684f64755-gm29f\" (UID: \"440b2c47-17eb-4f7f-893e-7ccc849d2557\") " pod="openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.512768 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmfwp\" (UniqueName: \"kubernetes.io/projected/440b2c47-17eb-4f7f-893e-7ccc849d2557-kube-api-access-hmfwp\") pod \"keystone-operator-controller-manager-5684f64755-gm29f\" (UID: \"440b2c47-17eb-4f7f-893e-7ccc849d2557\") " pod="openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f" Nov 25 15:51:27 crc kubenswrapper[4704]: I1125 15:51:27.636361 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f" Nov 25 15:51:28 crc kubenswrapper[4704]: I1125 15:51:28.104775 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f"] Nov 25 15:51:28 crc kubenswrapper[4704]: I1125 15:51:28.485516 4704 generic.go:334] "Generic (PLEG): container finished" podID="bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a" containerID="7014b49fedf2a160b4c1ebcaa23210df82f1dab31bf65138faee01c49e17aa83" exitCode=0 Nov 25 15:51:28 crc kubenswrapper[4704]: I1125 15:51:28.485619 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a","Type":"ContainerDied","Data":"7014b49fedf2a160b4c1ebcaa23210df82f1dab31bf65138faee01c49e17aa83"} Nov 25 15:51:28 crc kubenswrapper[4704]: I1125 15:51:28.486661 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f" event={"ID":"440b2c47-17eb-4f7f-893e-7ccc849d2557","Type":"ContainerStarted","Data":"c9445d9bd02866857ba98cc0794c8e5bd51bb1a4b216cb9ce17e6e11ec4b6b19"} Nov 25 15:51:29 crc kubenswrapper[4704]: I1125 15:51:29.494556 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a","Type":"ContainerStarted","Data":"bea6d572f79d1904e1267797db690741ccb6e2b2137f6b5aaf3af6484f707ec8"} Nov 25 15:51:29 crc kubenswrapper[4704]: I1125 15:51:29.495724 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:51:29 crc kubenswrapper[4704]: I1125 15:51:29.524936 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.995938637 podStartE2EDuration="43.524917923s" podCreationTimestamp="2025-11-25 15:50:46 +0000 UTC" firstStartedPulling="2025-11-25 15:50:48.920412599 +0000 UTC m=+935.188686380" lastFinishedPulling="2025-11-25 15:50:55.449391885 +0000 UTC m=+941.717665666" observedRunningTime="2025-11-25 15:51:29.523677937 +0000 UTC m=+975.791951718" watchObservedRunningTime="2025-11-25 15:51:29.524917923 +0000 UTC m=+975.793191704" Nov 25 15:51:32 crc kubenswrapper[4704]: I1125 15:51:32.518548 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f" event={"ID":"440b2c47-17eb-4f7f-893e-7ccc849d2557","Type":"ContainerStarted","Data":"2a2a5b0b5ce0a5e767b20810f2d2e689edbf5e09f3af5f2c30539c601a4e5e68"} Nov 25 15:51:32 crc kubenswrapper[4704]: I1125 15:51:32.519482 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f" Nov 25 15:51:32 crc kubenswrapper[4704]: I1125 15:51:32.539371 4704 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f" podStartSLOduration=1.96126208 podStartE2EDuration="5.53935136s" podCreationTimestamp="2025-11-25 15:51:27 +0000 UTC" firstStartedPulling="2025-11-25 15:51:28.114929774 +0000 UTC m=+974.383203555" lastFinishedPulling="2025-11-25 15:51:31.693019054 +0000 UTC m=+977.961292835" observedRunningTime="2025-11-25 15:51:32.538592618 +0000 UTC m=+978.806866399" watchObservedRunningTime="2025-11-25 15:51:32.53935136 +0000 UTC m=+978.807625141" Nov 25 15:51:37 crc kubenswrapper[4704]: I1125 15:51:37.641500 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5684f64755-gm29f" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.396108 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-24cf-account-create-update-2429z"] Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.397597 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-24cf-account-create-update-2429z" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.398606 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-24cf-account-create-update-2429z"] Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.399942 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-db-secret" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.485496 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-create-vkffq"] Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.486347 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-vkffq" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.494890 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-vkffq"] Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.537051 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp4m9\" (UniqueName: \"kubernetes.io/projected/01b4516d-6cb3-47f0-8343-62f39b0d9e52-kube-api-access-xp4m9\") pod \"keystone-24cf-account-create-update-2429z\" (UID: \"01b4516d-6cb3-47f0-8343-62f39b0d9e52\") " pod="glance-kuttl-tests/keystone-24cf-account-create-update-2429z" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.537142 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01b4516d-6cb3-47f0-8343-62f39b0d9e52-operator-scripts\") pod \"keystone-24cf-account-create-update-2429z\" (UID: \"01b4516d-6cb3-47f0-8343-62f39b0d9e52\") " pod="glance-kuttl-tests/keystone-24cf-account-create-update-2429z" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.552373 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/rabbitmq-server-0" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.640054 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01b4516d-6cb3-47f0-8343-62f39b0d9e52-operator-scripts\") pod \"keystone-24cf-account-create-update-2429z\" (UID: \"01b4516d-6cb3-47f0-8343-62f39b0d9e52\") " pod="glance-kuttl-tests/keystone-24cf-account-create-update-2429z" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.640176 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c68eff74-0c7f-4150-b600-ffc3293b4e4d-operator-scripts\") pod \"keystone-db-create-vkffq\" (UID: \"c68eff74-0c7f-4150-b600-ffc3293b4e4d\") " pod="glance-kuttl-tests/keystone-db-create-vkffq" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.640314 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp4m9\" (UniqueName: \"kubernetes.io/projected/01b4516d-6cb3-47f0-8343-62f39b0d9e52-kube-api-access-xp4m9\") pod \"keystone-24cf-account-create-update-2429z\" (UID: \"01b4516d-6cb3-47f0-8343-62f39b0d9e52\") " pod="glance-kuttl-tests/keystone-24cf-account-create-update-2429z" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.640376 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtvsp\" (UniqueName: \"kubernetes.io/projected/c68eff74-0c7f-4150-b600-ffc3293b4e4d-kube-api-access-rtvsp\") pod \"keystone-db-create-vkffq\" (UID: \"c68eff74-0c7f-4150-b600-ffc3293b4e4d\") " pod="glance-kuttl-tests/keystone-db-create-vkffq" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.640917 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01b4516d-6cb3-47f0-8343-62f39b0d9e52-operator-scripts\") pod \"keystone-24cf-account-create-update-2429z\" (UID: \"01b4516d-6cb3-47f0-8343-62f39b0d9e52\") " pod="glance-kuttl-tests/keystone-24cf-account-create-update-2429z" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.661719 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp4m9\" (UniqueName: \"kubernetes.io/projected/01b4516d-6cb3-47f0-8343-62f39b0d9e52-kube-api-access-xp4m9\") pod \"keystone-24cf-account-create-update-2429z\" (UID: \"01b4516d-6cb3-47f0-8343-62f39b0d9e52\") " pod="glance-kuttl-tests/keystone-24cf-account-create-update-2429z" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.716552 4704 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-24cf-account-create-update-2429z" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.742114 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtvsp\" (UniqueName: \"kubernetes.io/projected/c68eff74-0c7f-4150-b600-ffc3293b4e4d-kube-api-access-rtvsp\") pod \"keystone-db-create-vkffq\" (UID: \"c68eff74-0c7f-4150-b600-ffc3293b4e4d\") " pod="glance-kuttl-tests/keystone-db-create-vkffq" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.742203 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c68eff74-0c7f-4150-b600-ffc3293b4e4d-operator-scripts\") pod \"keystone-db-create-vkffq\" (UID: \"c68eff74-0c7f-4150-b600-ffc3293b4e4d\") " pod="glance-kuttl-tests/keystone-db-create-vkffq" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.743654 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c68eff74-0c7f-4150-b600-ffc3293b4e4d-operator-scripts\") pod \"keystone-db-create-vkffq\" (UID: \"c68eff74-0c7f-4150-b600-ffc3293b4e4d\") " pod="glance-kuttl-tests/keystone-db-create-vkffq" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.766700 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtvsp\" (UniqueName: \"kubernetes.io/projected/c68eff74-0c7f-4150-b600-ffc3293b4e4d-kube-api-access-rtvsp\") pod \"keystone-db-create-vkffq\" (UID: \"c68eff74-0c7f-4150-b600-ffc3293b4e4d\") " pod="glance-kuttl-tests/keystone-db-create-vkffq" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.811155 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-vkffq" Nov 25 15:51:38 crc kubenswrapper[4704]: I1125 15:51:38.965305 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-24cf-account-create-update-2429z"] Nov 25 15:51:38 crc kubenswrapper[4704]: W1125 15:51:38.977731 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01b4516d_6cb3_47f0_8343_62f39b0d9e52.slice/crio-cb814b160c2d0381876028e83a476c613f09d48119cb28a6c7b05527cbf469e8 WatchSource:0}: Error finding container cb814b160c2d0381876028e83a476c613f09d48119cb28a6c7b05527cbf469e8: Status 404 returned error can't find the container with id cb814b160c2d0381876028e83a476c613f09d48119cb28a6c7b05527cbf469e8 Nov 25 15:51:39 crc kubenswrapper[4704]: I1125 15:51:39.339876 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-vkffq"] Nov 25 15:51:39 crc kubenswrapper[4704]: W1125 15:51:39.352948 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc68eff74_0c7f_4150_b600_ffc3293b4e4d.slice/crio-92045f3dc94b4f2f7504efa92c14dfc05ed9e15642423be1e08a436f5c024e25 WatchSource:0}: Error finding container 92045f3dc94b4f2f7504efa92c14dfc05ed9e15642423be1e08a436f5c024e25: Status 404 returned error can't find the container with id 92045f3dc94b4f2f7504efa92c14dfc05ed9e15642423be1e08a436f5c024e25 Nov 25 15:51:39 crc kubenswrapper[4704]: I1125 15:51:39.561945 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-vkffq" event={"ID":"c68eff74-0c7f-4150-b600-ffc3293b4e4d","Type":"ContainerStarted","Data":"9405a28c037889ca82a7c5ce253c1599d92264f9345da36c713a826cde0efbfc"} Nov 25 15:51:39 crc kubenswrapper[4704]: I1125 15:51:39.562002 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-vkffq" 
event={"ID":"c68eff74-0c7f-4150-b600-ffc3293b4e4d","Type":"ContainerStarted","Data":"92045f3dc94b4f2f7504efa92c14dfc05ed9e15642423be1e08a436f5c024e25"} Nov 25 15:51:39 crc kubenswrapper[4704]: I1125 15:51:39.563477 4704 generic.go:334] "Generic (PLEG): container finished" podID="01b4516d-6cb3-47f0-8343-62f39b0d9e52" containerID="3f5f3abd3f3cba5b8fcd0b38d73e7ffa3cd584fe2cf8f637aba9b1e7b8b02eed" exitCode=0 Nov 25 15:51:39 crc kubenswrapper[4704]: I1125 15:51:39.563538 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-24cf-account-create-update-2429z" event={"ID":"01b4516d-6cb3-47f0-8343-62f39b0d9e52","Type":"ContainerDied","Data":"3f5f3abd3f3cba5b8fcd0b38d73e7ffa3cd584fe2cf8f637aba9b1e7b8b02eed"} Nov 25 15:51:39 crc kubenswrapper[4704]: I1125 15:51:39.563573 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-24cf-account-create-update-2429z" event={"ID":"01b4516d-6cb3-47f0-8343-62f39b0d9e52","Type":"ContainerStarted","Data":"cb814b160c2d0381876028e83a476c613f09d48119cb28a6c7b05527cbf469e8"} Nov 25 15:51:39 crc kubenswrapper[4704]: I1125 15:51:39.599829 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-db-create-vkffq" podStartSLOduration=1.599805612 podStartE2EDuration="1.599805612s" podCreationTimestamp="2025-11-25 15:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:51:39.581475691 +0000 UTC m=+985.849749492" watchObservedRunningTime="2025-11-25 15:51:39.599805612 +0000 UTC m=+985.868079403" Nov 25 15:51:40 crc kubenswrapper[4704]: I1125 15:51:40.572299 4704 generic.go:334] "Generic (PLEG): container finished" podID="c68eff74-0c7f-4150-b600-ffc3293b4e4d" containerID="9405a28c037889ca82a7c5ce253c1599d92264f9345da36c713a826cde0efbfc" exitCode=0 Nov 25 15:51:40 crc kubenswrapper[4704]: I1125 15:51:40.572491 4704 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-vkffq" event={"ID":"c68eff74-0c7f-4150-b600-ffc3293b4e4d","Type":"ContainerDied","Data":"9405a28c037889ca82a7c5ce253c1599d92264f9345da36c713a826cde0efbfc"} Nov 25 15:51:41 crc kubenswrapper[4704]: I1125 15:51:41.022318 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-24cf-account-create-update-2429z" Nov 25 15:51:41 crc kubenswrapper[4704]: I1125 15:51:41.174717 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp4m9\" (UniqueName: \"kubernetes.io/projected/01b4516d-6cb3-47f0-8343-62f39b0d9e52-kube-api-access-xp4m9\") pod \"01b4516d-6cb3-47f0-8343-62f39b0d9e52\" (UID: \"01b4516d-6cb3-47f0-8343-62f39b0d9e52\") " Nov 25 15:51:41 crc kubenswrapper[4704]: I1125 15:51:41.174892 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01b4516d-6cb3-47f0-8343-62f39b0d9e52-operator-scripts\") pod \"01b4516d-6cb3-47f0-8343-62f39b0d9e52\" (UID: \"01b4516d-6cb3-47f0-8343-62f39b0d9e52\") " Nov 25 15:51:41 crc kubenswrapper[4704]: I1125 15:51:41.176198 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01b4516d-6cb3-47f0-8343-62f39b0d9e52-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01b4516d-6cb3-47f0-8343-62f39b0d9e52" (UID: "01b4516d-6cb3-47f0-8343-62f39b0d9e52"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:51:41 crc kubenswrapper[4704]: I1125 15:51:41.181904 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01b4516d-6cb3-47f0-8343-62f39b0d9e52-kube-api-access-xp4m9" (OuterVolumeSpecName: "kube-api-access-xp4m9") pod "01b4516d-6cb3-47f0-8343-62f39b0d9e52" (UID: "01b4516d-6cb3-47f0-8343-62f39b0d9e52"). 
InnerVolumeSpecName "kube-api-access-xp4m9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:51:41 crc kubenswrapper[4704]: I1125 15:51:41.276156 4704 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01b4516d-6cb3-47f0-8343-62f39b0d9e52-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:51:41 crc kubenswrapper[4704]: I1125 15:51:41.276208 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp4m9\" (UniqueName: \"kubernetes.io/projected/01b4516d-6cb3-47f0-8343-62f39b0d9e52-kube-api-access-xp4m9\") on node \"crc\" DevicePath \"\"" Nov 25 15:51:41 crc kubenswrapper[4704]: I1125 15:51:41.592829 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-24cf-account-create-update-2429z" Nov 25 15:51:41 crc kubenswrapper[4704]: I1125 15:51:41.592769 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-24cf-account-create-update-2429z" event={"ID":"01b4516d-6cb3-47f0-8343-62f39b0d9e52","Type":"ContainerDied","Data":"cb814b160c2d0381876028e83a476c613f09d48119cb28a6c7b05527cbf469e8"} Nov 25 15:51:41 crc kubenswrapper[4704]: I1125 15:51:41.592896 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb814b160c2d0381876028e83a476c613f09d48119cb28a6c7b05527cbf469e8" Nov 25 15:51:41 crc kubenswrapper[4704]: I1125 15:51:41.880506 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-vkffq" Nov 25 15:51:41 crc kubenswrapper[4704]: I1125 15:51:41.997938 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c68eff74-0c7f-4150-b600-ffc3293b4e4d-operator-scripts\") pod \"c68eff74-0c7f-4150-b600-ffc3293b4e4d\" (UID: \"c68eff74-0c7f-4150-b600-ffc3293b4e4d\") " Nov 25 15:51:41 crc kubenswrapper[4704]: I1125 15:51:41.998039 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtvsp\" (UniqueName: \"kubernetes.io/projected/c68eff74-0c7f-4150-b600-ffc3293b4e4d-kube-api-access-rtvsp\") pod \"c68eff74-0c7f-4150-b600-ffc3293b4e4d\" (UID: \"c68eff74-0c7f-4150-b600-ffc3293b4e4d\") " Nov 25 15:51:41 crc kubenswrapper[4704]: I1125 15:51:41.999536 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c68eff74-0c7f-4150-b600-ffc3293b4e4d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c68eff74-0c7f-4150-b600-ffc3293b4e4d" (UID: "c68eff74-0c7f-4150-b600-ffc3293b4e4d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:51:42 crc kubenswrapper[4704]: I1125 15:51:42.003077 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c68eff74-0c7f-4150-b600-ffc3293b4e4d-kube-api-access-rtvsp" (OuterVolumeSpecName: "kube-api-access-rtvsp") pod "c68eff74-0c7f-4150-b600-ffc3293b4e4d" (UID: "c68eff74-0c7f-4150-b600-ffc3293b4e4d"). InnerVolumeSpecName "kube-api-access-rtvsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:51:42 crc kubenswrapper[4704]: I1125 15:51:42.099404 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtvsp\" (UniqueName: \"kubernetes.io/projected/c68eff74-0c7f-4150-b600-ffc3293b4e4d-kube-api-access-rtvsp\") on node \"crc\" DevicePath \"\"" Nov 25 15:51:42 crc kubenswrapper[4704]: I1125 15:51:42.099452 4704 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c68eff74-0c7f-4150-b600-ffc3293b4e4d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:51:42 crc kubenswrapper[4704]: I1125 15:51:42.600210 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-vkffq" event={"ID":"c68eff74-0c7f-4150-b600-ffc3293b4e4d","Type":"ContainerDied","Data":"92045f3dc94b4f2f7504efa92c14dfc05ed9e15642423be1e08a436f5c024e25"} Nov 25 15:51:42 crc kubenswrapper[4704]: I1125 15:51:42.600726 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92045f3dc94b4f2f7504efa92c14dfc05ed9e15642423be1e08a436f5c024e25" Nov 25 15:51:42 crc kubenswrapper[4704]: I1125 15:51:42.600287 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-vkffq" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.037055 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-sync-k6cq8"] Nov 25 15:51:44 crc kubenswrapper[4704]: E1125 15:51:44.037327 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c68eff74-0c7f-4150-b600-ffc3293b4e4d" containerName="mariadb-database-create" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.037346 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68eff74-0c7f-4150-b600-ffc3293b4e4d" containerName="mariadb-database-create" Nov 25 15:51:44 crc kubenswrapper[4704]: E1125 15:51:44.037359 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b4516d-6cb3-47f0-8343-62f39b0d9e52" containerName="mariadb-account-create-update" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.037366 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b4516d-6cb3-47f0-8343-62f39b0d9e52" containerName="mariadb-account-create-update" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.037523 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="c68eff74-0c7f-4150-b600-ffc3293b4e4d" containerName="mariadb-database-create" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.037538 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b4516d-6cb3-47f0-8343-62f39b0d9e52" containerName="mariadb-account-create-update" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.038078 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-k6cq8" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.040464 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.040883 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-78ztb" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.041162 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.041403 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.050641 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-k6cq8"] Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.225416 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46fbp\" (UniqueName: \"kubernetes.io/projected/4240ea8b-01f6-4a52-99e8-f985830dacd9-kube-api-access-46fbp\") pod \"keystone-db-sync-k6cq8\" (UID: \"4240ea8b-01f6-4a52-99e8-f985830dacd9\") " pod="glance-kuttl-tests/keystone-db-sync-k6cq8" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.226055 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4240ea8b-01f6-4a52-99e8-f985830dacd9-config-data\") pod \"keystone-db-sync-k6cq8\" (UID: \"4240ea8b-01f6-4a52-99e8-f985830dacd9\") " pod="glance-kuttl-tests/keystone-db-sync-k6cq8" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.327917 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4240ea8b-01f6-4a52-99e8-f985830dacd9-config-data\") pod 
\"keystone-db-sync-k6cq8\" (UID: \"4240ea8b-01f6-4a52-99e8-f985830dacd9\") " pod="glance-kuttl-tests/keystone-db-sync-k6cq8" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.328043 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46fbp\" (UniqueName: \"kubernetes.io/projected/4240ea8b-01f6-4a52-99e8-f985830dacd9-kube-api-access-46fbp\") pod \"keystone-db-sync-k6cq8\" (UID: \"4240ea8b-01f6-4a52-99e8-f985830dacd9\") " pod="glance-kuttl-tests/keystone-db-sync-k6cq8" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.339831 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4240ea8b-01f6-4a52-99e8-f985830dacd9-config-data\") pod \"keystone-db-sync-k6cq8\" (UID: \"4240ea8b-01f6-4a52-99e8-f985830dacd9\") " pod="glance-kuttl-tests/keystone-db-sync-k6cq8" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.345119 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46fbp\" (UniqueName: \"kubernetes.io/projected/4240ea8b-01f6-4a52-99e8-f985830dacd9-kube-api-access-46fbp\") pod \"keystone-db-sync-k6cq8\" (UID: \"4240ea8b-01f6-4a52-99e8-f985830dacd9\") " pod="glance-kuttl-tests/keystone-db-sync-k6cq8" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.359292 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-k6cq8" Nov 25 15:51:44 crc kubenswrapper[4704]: I1125 15:51:44.828255 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-k6cq8"] Nov 25 15:51:44 crc kubenswrapper[4704]: W1125 15:51:44.841009 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4240ea8b_01f6_4a52_99e8_f985830dacd9.slice/crio-6fee391f2a4b9a00a8bf0e68f9e300974de4d8c52cc92c88feddfe10d78b2de4 WatchSource:0}: Error finding container 6fee391f2a4b9a00a8bf0e68f9e300974de4d8c52cc92c88feddfe10d78b2de4: Status 404 returned error can't find the container with id 6fee391f2a4b9a00a8bf0e68f9e300974de4d8c52cc92c88feddfe10d78b2de4 Nov 25 15:51:45 crc kubenswrapper[4704]: I1125 15:51:45.623065 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-k6cq8" event={"ID":"4240ea8b-01f6-4a52-99e8-f985830dacd9","Type":"ContainerStarted","Data":"6fee391f2a4b9a00a8bf0e68f9e300974de4d8c52cc92c88feddfe10d78b2de4"} Nov 25 15:51:51 crc kubenswrapper[4704]: I1125 15:51:51.409616 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-24d6s"] Nov 25 15:51:51 crc kubenswrapper[4704]: I1125 15:51:51.411570 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-24d6s" Nov 25 15:51:51 crc kubenswrapper[4704]: I1125 15:51:51.414423 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-hspjw" Nov 25 15:51:51 crc kubenswrapper[4704]: I1125 15:51:51.416419 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-24d6s"] Nov 25 15:51:51 crc kubenswrapper[4704]: I1125 15:51:51.457666 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbcl6\" (UniqueName: \"kubernetes.io/projected/8200c73c-66eb-457b-8cdc-c3773b532d29-kube-api-access-qbcl6\") pod \"horizon-operator-index-24d6s\" (UID: \"8200c73c-66eb-457b-8cdc-c3773b532d29\") " pod="openstack-operators/horizon-operator-index-24d6s" Nov 25 15:51:51 crc kubenswrapper[4704]: I1125 15:51:51.558716 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbcl6\" (UniqueName: \"kubernetes.io/projected/8200c73c-66eb-457b-8cdc-c3773b532d29-kube-api-access-qbcl6\") pod \"horizon-operator-index-24d6s\" (UID: \"8200c73c-66eb-457b-8cdc-c3773b532d29\") " pod="openstack-operators/horizon-operator-index-24d6s" Nov 25 15:51:51 crc kubenswrapper[4704]: I1125 15:51:51.582474 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbcl6\" (UniqueName: \"kubernetes.io/projected/8200c73c-66eb-457b-8cdc-c3773b532d29-kube-api-access-qbcl6\") pod \"horizon-operator-index-24d6s\" (UID: \"8200c73c-66eb-457b-8cdc-c3773b532d29\") " pod="openstack-operators/horizon-operator-index-24d6s" Nov 25 15:51:51 crc kubenswrapper[4704]: I1125 15:51:51.767876 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-24d6s" Nov 25 15:51:52 crc kubenswrapper[4704]: I1125 15:51:52.801825 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-9fzqh"] Nov 25 15:51:52 crc kubenswrapper[4704]: I1125 15:51:52.803363 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-9fzqh" Nov 25 15:51:52 crc kubenswrapper[4704]: I1125 15:51:52.806113 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-wxmct" Nov 25 15:51:52 crc kubenswrapper[4704]: I1125 15:51:52.811664 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-9fzqh"] Nov 25 15:51:52 crc kubenswrapper[4704]: I1125 15:51:52.873802 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg7s6\" (UniqueName: \"kubernetes.io/projected/3988efd6-25d0-48aa-8750-aad3d5d9c525-kube-api-access-wg7s6\") pod \"swift-operator-index-9fzqh\" (UID: \"3988efd6-25d0-48aa-8750-aad3d5d9c525\") " pod="openstack-operators/swift-operator-index-9fzqh" Nov 25 15:51:52 crc kubenswrapper[4704]: I1125 15:51:52.975412 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg7s6\" (UniqueName: \"kubernetes.io/projected/3988efd6-25d0-48aa-8750-aad3d5d9c525-kube-api-access-wg7s6\") pod \"swift-operator-index-9fzqh\" (UID: \"3988efd6-25d0-48aa-8750-aad3d5d9c525\") " pod="openstack-operators/swift-operator-index-9fzqh" Nov 25 15:51:52 crc kubenswrapper[4704]: I1125 15:51:52.997365 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg7s6\" (UniqueName: \"kubernetes.io/projected/3988efd6-25d0-48aa-8750-aad3d5d9c525-kube-api-access-wg7s6\") pod \"swift-operator-index-9fzqh\" (UID: \"3988efd6-25d0-48aa-8750-aad3d5d9c525\") " 
pod="openstack-operators/swift-operator-index-9fzqh" Nov 25 15:51:53 crc kubenswrapper[4704]: I1125 15:51:53.179107 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-9fzqh" Nov 25 15:51:53 crc kubenswrapper[4704]: I1125 15:51:53.531553 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-24d6s"] Nov 25 15:51:53 crc kubenswrapper[4704]: W1125 15:51:53.549646 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8200c73c_66eb_457b_8cdc_c3773b532d29.slice/crio-186a72d3c6e63850723fa2d474c47d7dae8fc78025dcb0f9703662f753abcb38 WatchSource:0}: Error finding container 186a72d3c6e63850723fa2d474c47d7dae8fc78025dcb0f9703662f753abcb38: Status 404 returned error can't find the container with id 186a72d3c6e63850723fa2d474c47d7dae8fc78025dcb0f9703662f753abcb38 Nov 25 15:51:53 crc kubenswrapper[4704]: I1125 15:51:53.677765 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-k6cq8" event={"ID":"4240ea8b-01f6-4a52-99e8-f985830dacd9","Type":"ContainerStarted","Data":"6694d415abf5b9c4894cc8d83ec1ee7122e2fe70c2d9082ac3a9d8f9702e89fc"} Nov 25 15:51:53 crc kubenswrapper[4704]: I1125 15:51:53.679391 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-24d6s" event={"ID":"8200c73c-66eb-457b-8cdc-c3773b532d29","Type":"ContainerStarted","Data":"186a72d3c6e63850723fa2d474c47d7dae8fc78025dcb0f9703662f753abcb38"} Nov 25 15:51:53 crc kubenswrapper[4704]: I1125 15:51:53.797498 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-9fzqh"] Nov 25 15:51:53 crc kubenswrapper[4704]: W1125 15:51:53.800633 4704 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3988efd6_25d0_48aa_8750_aad3d5d9c525.slice/crio-29855269c5c4aa039fdc91dfd81f97650e48012f7961ed639d1a41c0ee6df1a0 WatchSource:0}: Error finding container 29855269c5c4aa039fdc91dfd81f97650e48012f7961ed639d1a41c0ee6df1a0: Status 404 returned error can't find the container with id 29855269c5c4aa039fdc91dfd81f97650e48012f7961ed639d1a41c0ee6df1a0 Nov 25 15:51:54 crc kubenswrapper[4704]: I1125 15:51:54.714120 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-9fzqh" event={"ID":"3988efd6-25d0-48aa-8750-aad3d5d9c525","Type":"ContainerStarted","Data":"29855269c5c4aa039fdc91dfd81f97650e48012f7961ed639d1a41c0ee6df1a0"} Nov 25 15:51:54 crc kubenswrapper[4704]: I1125 15:51:54.740080 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-db-sync-k6cq8" podStartSLOduration=2.202214477 podStartE2EDuration="10.740061721s" podCreationTimestamp="2025-11-25 15:51:44 +0000 UTC" firstStartedPulling="2025-11-25 15:51:44.843402878 +0000 UTC m=+991.111676659" lastFinishedPulling="2025-11-25 15:51:53.381250122 +0000 UTC m=+999.649523903" observedRunningTime="2025-11-25 15:51:54.738044422 +0000 UTC m=+1001.006318223" watchObservedRunningTime="2025-11-25 15:51:54.740061721 +0000 UTC m=+1001.008335492" Nov 25 15:51:55 crc kubenswrapper[4704]: I1125 15:51:55.722073 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-24d6s" event={"ID":"8200c73c-66eb-457b-8cdc-c3773b532d29","Type":"ContainerStarted","Data":"263d5a2025df9c7982aff2ca530eb919141dd95fab7fe45275028db13dd0eb64"} Nov 25 15:51:55 crc kubenswrapper[4704]: I1125 15:51:55.723807 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-9fzqh" 
event={"ID":"3988efd6-25d0-48aa-8750-aad3d5d9c525","Type":"ContainerStarted","Data":"d60616f9ac9f41201cbd07d233c7d5b859ca72030d0cad8738893f57e4c0ee72"} Nov 25 15:51:55 crc kubenswrapper[4704]: I1125 15:51:55.740090 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-24d6s" podStartSLOduration=2.944757062 podStartE2EDuration="4.740063073s" podCreationTimestamp="2025-11-25 15:51:51 +0000 UTC" firstStartedPulling="2025-11-25 15:51:53.551099348 +0000 UTC m=+999.819373129" lastFinishedPulling="2025-11-25 15:51:55.346405359 +0000 UTC m=+1001.614679140" observedRunningTime="2025-11-25 15:51:55.735930264 +0000 UTC m=+1002.004204045" watchObservedRunningTime="2025-11-25 15:51:55.740063073 +0000 UTC m=+1002.008336854" Nov 25 15:51:55 crc kubenswrapper[4704]: I1125 15:51:55.754513 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-9fzqh" podStartSLOduration=2.387462244 podStartE2EDuration="3.754491471s" podCreationTimestamp="2025-11-25 15:51:52 +0000 UTC" firstStartedPulling="2025-11-25 15:51:53.802567176 +0000 UTC m=+1000.070840957" lastFinishedPulling="2025-11-25 15:51:55.169596403 +0000 UTC m=+1001.437870184" observedRunningTime="2025-11-25 15:51:55.754200503 +0000 UTC m=+1002.022474284" watchObservedRunningTime="2025-11-25 15:51:55.754491471 +0000 UTC m=+1002.022765252" Nov 25 15:51:58 crc kubenswrapper[4704]: I1125 15:51:58.753385 4704 generic.go:334] "Generic (PLEG): container finished" podID="4240ea8b-01f6-4a52-99e8-f985830dacd9" containerID="6694d415abf5b9c4894cc8d83ec1ee7122e2fe70c2d9082ac3a9d8f9702e89fc" exitCode=0 Nov 25 15:51:58 crc kubenswrapper[4704]: I1125 15:51:58.753488 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-k6cq8" event={"ID":"4240ea8b-01f6-4a52-99e8-f985830dacd9","Type":"ContainerDied","Data":"6694d415abf5b9c4894cc8d83ec1ee7122e2fe70c2d9082ac3a9d8f9702e89fc"} Nov 25 
15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.083075 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-k6cq8" Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.213573 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46fbp\" (UniqueName: \"kubernetes.io/projected/4240ea8b-01f6-4a52-99e8-f985830dacd9-kube-api-access-46fbp\") pod \"4240ea8b-01f6-4a52-99e8-f985830dacd9\" (UID: \"4240ea8b-01f6-4a52-99e8-f985830dacd9\") " Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.214238 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4240ea8b-01f6-4a52-99e8-f985830dacd9-config-data\") pod \"4240ea8b-01f6-4a52-99e8-f985830dacd9\" (UID: \"4240ea8b-01f6-4a52-99e8-f985830dacd9\") " Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.219497 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4240ea8b-01f6-4a52-99e8-f985830dacd9-kube-api-access-46fbp" (OuterVolumeSpecName: "kube-api-access-46fbp") pod "4240ea8b-01f6-4a52-99e8-f985830dacd9" (UID: "4240ea8b-01f6-4a52-99e8-f985830dacd9"). InnerVolumeSpecName "kube-api-access-46fbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.247024 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4240ea8b-01f6-4a52-99e8-f985830dacd9-config-data" (OuterVolumeSpecName: "config-data") pod "4240ea8b-01f6-4a52-99e8-f985830dacd9" (UID: "4240ea8b-01f6-4a52-99e8-f985830dacd9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.315647 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46fbp\" (UniqueName: \"kubernetes.io/projected/4240ea8b-01f6-4a52-99e8-f985830dacd9-kube-api-access-46fbp\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.315687 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4240ea8b-01f6-4a52-99e8-f985830dacd9-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.768462 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-k6cq8" event={"ID":"4240ea8b-01f6-4a52-99e8-f985830dacd9","Type":"ContainerDied","Data":"6fee391f2a4b9a00a8bf0e68f9e300974de4d8c52cc92c88feddfe10d78b2de4"} Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.768507 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fee391f2a4b9a00a8bf0e68f9e300974de4d8c52cc92c88feddfe10d78b2de4" Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.768529 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-k6cq8" Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.960658 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-ftgpq"] Nov 25 15:52:00 crc kubenswrapper[4704]: E1125 15:52:00.960925 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4240ea8b-01f6-4a52-99e8-f985830dacd9" containerName="keystone-db-sync" Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.960938 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="4240ea8b-01f6-4a52-99e8-f985830dacd9" containerName="keystone-db-sync" Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.961059 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="4240ea8b-01f6-4a52-99e8-f985830dacd9" containerName="keystone-db-sync" Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.961522 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.963923 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.963938 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-78ztb" Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.965535 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.965600 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"osp-secret" Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.970527 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Nov 25 15:52:00 crc kubenswrapper[4704]: I1125 15:52:00.979359 4704 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["glance-kuttl-tests/keystone-bootstrap-ftgpq"] Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.024471 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-fernet-keys\") pod \"keystone-bootstrap-ftgpq\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.024523 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-config-data\") pod \"keystone-bootstrap-ftgpq\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.024581 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-scripts\") pod \"keystone-bootstrap-ftgpq\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.024601 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpchx\" (UniqueName: \"kubernetes.io/projected/95561520-558e-4407-9c6c-f76abdf194a7-kube-api-access-kpchx\") pod \"keystone-bootstrap-ftgpq\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.024633 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-credential-keys\") pod \"keystone-bootstrap-ftgpq\" (UID: 
\"95561520-558e-4407-9c6c-f76abdf194a7\") " pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.125409 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-scripts\") pod \"keystone-bootstrap-ftgpq\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.125516 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpchx\" (UniqueName: \"kubernetes.io/projected/95561520-558e-4407-9c6c-f76abdf194a7-kube-api-access-kpchx\") pod \"keystone-bootstrap-ftgpq\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.125561 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-credential-keys\") pod \"keystone-bootstrap-ftgpq\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.125591 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-fernet-keys\") pod \"keystone-bootstrap-ftgpq\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.125617 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-config-data\") pod \"keystone-bootstrap-ftgpq\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " 
pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.132999 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-scripts\") pod \"keystone-bootstrap-ftgpq\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.133091 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-credential-keys\") pod \"keystone-bootstrap-ftgpq\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.133468 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-fernet-keys\") pod \"keystone-bootstrap-ftgpq\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.138373 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-config-data\") pod \"keystone-bootstrap-ftgpq\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.148688 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpchx\" (UniqueName: \"kubernetes.io/projected/95561520-558e-4407-9c6c-f76abdf194a7-kube-api-access-kpchx\") pod \"keystone-bootstrap-ftgpq\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 
15:52:01.278115 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.686043 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-ftgpq"] Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.768913 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/horizon-operator-index-24d6s" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.769438 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-index-24d6s" Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.783846 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" event={"ID":"95561520-558e-4407-9c6c-f76abdf194a7","Type":"ContainerStarted","Data":"7dd15fc71ee7eddf50354d3402a111684801716ac9aad46248d450eb8cfb16a1"} Nov 25 15:52:01 crc kubenswrapper[4704]: I1125 15:52:01.802060 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/horizon-operator-index-24d6s" Nov 25 15:52:02 crc kubenswrapper[4704]: I1125 15:52:02.798376 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" event={"ID":"95561520-558e-4407-9c6c-f76abdf194a7","Type":"ContainerStarted","Data":"9344d7f21129ef3164497f4db662d15a6a375a283fc05ff61eb73a31ceda615c"} Nov 25 15:52:02 crc kubenswrapper[4704]: I1125 15:52:02.816493 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" podStartSLOduration=2.816474608 podStartE2EDuration="2.816474608s" podCreationTimestamp="2025-11-25 15:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:52:02.814031517 +0000 UTC 
m=+1009.082305318" watchObservedRunningTime="2025-11-25 15:52:02.816474608 +0000 UTC m=+1009.084748389" Nov 25 15:52:02 crc kubenswrapper[4704]: I1125 15:52:02.829346 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-index-24d6s" Nov 25 15:52:03 crc kubenswrapper[4704]: I1125 15:52:03.179934 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-index-9fzqh" Nov 25 15:52:03 crc kubenswrapper[4704]: I1125 15:52:03.180218 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/swift-operator-index-9fzqh" Nov 25 15:52:03 crc kubenswrapper[4704]: I1125 15:52:03.212726 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/swift-operator-index-9fzqh" Nov 25 15:52:03 crc kubenswrapper[4704]: I1125 15:52:03.832716 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-index-9fzqh" Nov 25 15:52:04 crc kubenswrapper[4704]: I1125 15:52:04.810437 4704 generic.go:334] "Generic (PLEG): container finished" podID="95561520-558e-4407-9c6c-f76abdf194a7" containerID="9344d7f21129ef3164497f4db662d15a6a375a283fc05ff61eb73a31ceda615c" exitCode=0 Nov 25 15:52:04 crc kubenswrapper[4704]: I1125 15:52:04.810542 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" event={"ID":"95561520-558e-4407-9c6c-f76abdf194a7","Type":"ContainerDied","Data":"9344d7f21129ef3164497f4db662d15a6a375a283fc05ff61eb73a31ceda615c"} Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.122991 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.294451 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpchx\" (UniqueName: \"kubernetes.io/projected/95561520-558e-4407-9c6c-f76abdf194a7-kube-api-access-kpchx\") pod \"95561520-558e-4407-9c6c-f76abdf194a7\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.294521 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-config-data\") pod \"95561520-558e-4407-9c6c-f76abdf194a7\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.294566 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-credential-keys\") pod \"95561520-558e-4407-9c6c-f76abdf194a7\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.294603 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-fernet-keys\") pod \"95561520-558e-4407-9c6c-f76abdf194a7\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.294676 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-scripts\") pod \"95561520-558e-4407-9c6c-f76abdf194a7\" (UID: \"95561520-558e-4407-9c6c-f76abdf194a7\") " Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.300607 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "95561520-558e-4407-9c6c-f76abdf194a7" (UID: "95561520-558e-4407-9c6c-f76abdf194a7"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.301911 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "95561520-558e-4407-9c6c-f76abdf194a7" (UID: "95561520-558e-4407-9c6c-f76abdf194a7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.301937 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95561520-558e-4407-9c6c-f76abdf194a7-kube-api-access-kpchx" (OuterVolumeSpecName: "kube-api-access-kpchx") pod "95561520-558e-4407-9c6c-f76abdf194a7" (UID: "95561520-558e-4407-9c6c-f76abdf194a7"). InnerVolumeSpecName "kube-api-access-kpchx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.305912 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-scripts" (OuterVolumeSpecName: "scripts") pod "95561520-558e-4407-9c6c-f76abdf194a7" (UID: "95561520-558e-4407-9c6c-f76abdf194a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.321896 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-config-data" (OuterVolumeSpecName: "config-data") pod "95561520-558e-4407-9c6c-f76abdf194a7" (UID: "95561520-558e-4407-9c6c-f76abdf194a7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.396446 4704 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.396486 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpchx\" (UniqueName: \"kubernetes.io/projected/95561520-558e-4407-9c6c-f76abdf194a7-kube-api-access-kpchx\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.396496 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.396506 4704 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.396515 4704 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95561520-558e-4407-9c6c-f76abdf194a7-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.825987 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" event={"ID":"95561520-558e-4407-9c6c-f76abdf194a7","Type":"ContainerDied","Data":"7dd15fc71ee7eddf50354d3402a111684801716ac9aad46248d450eb8cfb16a1"} Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.826048 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-ftgpq" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.826052 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dd15fc71ee7eddf50354d3402a111684801716ac9aad46248d450eb8cfb16a1" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.893460 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-fc478b69-fc2jj"] Nov 25 15:52:06 crc kubenswrapper[4704]: E1125 15:52:06.893724 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95561520-558e-4407-9c6c-f76abdf194a7" containerName="keystone-bootstrap" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.893738 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="95561520-558e-4407-9c6c-f76abdf194a7" containerName="keystone-bootstrap" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.893859 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="95561520-558e-4407-9c6c-f76abdf194a7" containerName="keystone-bootstrap" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.894323 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.897446 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.898511 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.898599 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.915579 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-fc478b69-fc2jj"] Nov 25 15:52:06 crc kubenswrapper[4704]: I1125 15:52:06.917366 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-78ztb" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.003501 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/45254a49-d34b-464a-97ec-0b04cbd7c1fe-credential-keys\") pod \"keystone-fc478b69-fc2jj\" (UID: \"45254a49-d34b-464a-97ec-0b04cbd7c1fe\") " pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.003582 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/45254a49-d34b-464a-97ec-0b04cbd7c1fe-fernet-keys\") pod \"keystone-fc478b69-fc2jj\" (UID: \"45254a49-d34b-464a-97ec-0b04cbd7c1fe\") " pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.003622 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/45254a49-d34b-464a-97ec-0b04cbd7c1fe-config-data\") pod \"keystone-fc478b69-fc2jj\" (UID: \"45254a49-d34b-464a-97ec-0b04cbd7c1fe\") " pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.003745 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcsr6\" (UniqueName: \"kubernetes.io/projected/45254a49-d34b-464a-97ec-0b04cbd7c1fe-kube-api-access-zcsr6\") pod \"keystone-fc478b69-fc2jj\" (UID: \"45254a49-d34b-464a-97ec-0b04cbd7c1fe\") " pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.003798 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45254a49-d34b-464a-97ec-0b04cbd7c1fe-scripts\") pod \"keystone-fc478b69-fc2jj\" (UID: \"45254a49-d34b-464a-97ec-0b04cbd7c1fe\") " pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.105200 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45254a49-d34b-464a-97ec-0b04cbd7c1fe-scripts\") pod \"keystone-fc478b69-fc2jj\" (UID: \"45254a49-d34b-464a-97ec-0b04cbd7c1fe\") " pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.105277 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/45254a49-d34b-464a-97ec-0b04cbd7c1fe-credential-keys\") pod \"keystone-fc478b69-fc2jj\" (UID: \"45254a49-d34b-464a-97ec-0b04cbd7c1fe\") " pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.105310 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/45254a49-d34b-464a-97ec-0b04cbd7c1fe-fernet-keys\") pod \"keystone-fc478b69-fc2jj\" (UID: \"45254a49-d34b-464a-97ec-0b04cbd7c1fe\") " pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.105336 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45254a49-d34b-464a-97ec-0b04cbd7c1fe-config-data\") pod \"keystone-fc478b69-fc2jj\" (UID: \"45254a49-d34b-464a-97ec-0b04cbd7c1fe\") " pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.106221 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcsr6\" (UniqueName: \"kubernetes.io/projected/45254a49-d34b-464a-97ec-0b04cbd7c1fe-kube-api-access-zcsr6\") pod \"keystone-fc478b69-fc2jj\" (UID: \"45254a49-d34b-464a-97ec-0b04cbd7c1fe\") " pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.110618 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/45254a49-d34b-464a-97ec-0b04cbd7c1fe-credential-keys\") pod \"keystone-fc478b69-fc2jj\" (UID: \"45254a49-d34b-464a-97ec-0b04cbd7c1fe\") " pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.111357 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45254a49-d34b-464a-97ec-0b04cbd7c1fe-config-data\") pod \"keystone-fc478b69-fc2jj\" (UID: \"45254a49-d34b-464a-97ec-0b04cbd7c1fe\") " pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.111622 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45254a49-d34b-464a-97ec-0b04cbd7c1fe-scripts\") pod 
\"keystone-fc478b69-fc2jj\" (UID: \"45254a49-d34b-464a-97ec-0b04cbd7c1fe\") " pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.112732 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/45254a49-d34b-464a-97ec-0b04cbd7c1fe-fernet-keys\") pod \"keystone-fc478b69-fc2jj\" (UID: \"45254a49-d34b-464a-97ec-0b04cbd7c1fe\") " pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.123141 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcsr6\" (UniqueName: \"kubernetes.io/projected/45254a49-d34b-464a-97ec-0b04cbd7c1fe-kube-api-access-zcsr6\") pod \"keystone-fc478b69-fc2jj\" (UID: \"45254a49-d34b-464a-97ec-0b04cbd7c1fe\") " pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.210838 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.438897 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2"] Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.444781 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.448408 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8zdtm" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.450542 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2"] Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.612267 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8bbf\" (UniqueName: \"kubernetes.io/projected/58a8a704-aa96-4788-825e-a343803ac76b-kube-api-access-j8bbf\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2\" (UID: \"58a8a704-aa96-4788-825e-a343803ac76b\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.612336 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/58a8a704-aa96-4788-825e-a343803ac76b-bundle\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2\" (UID: \"58a8a704-aa96-4788-825e-a343803ac76b\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.612388 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/58a8a704-aa96-4788-825e-a343803ac76b-util\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2\" (UID: \"58a8a704-aa96-4788-825e-a343803ac76b\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 
15:52:07.619906 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-fc478b69-fc2jj"] Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.713506 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/58a8a704-aa96-4788-825e-a343803ac76b-bundle\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2\" (UID: \"58a8a704-aa96-4788-825e-a343803ac76b\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.713595 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/58a8a704-aa96-4788-825e-a343803ac76b-util\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2\" (UID: \"58a8a704-aa96-4788-825e-a343803ac76b\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.713694 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8bbf\" (UniqueName: \"kubernetes.io/projected/58a8a704-aa96-4788-825e-a343803ac76b-kube-api-access-j8bbf\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2\" (UID: \"58a8a704-aa96-4788-825e-a343803ac76b\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.714294 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/58a8a704-aa96-4788-825e-a343803ac76b-bundle\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2\" (UID: \"58a8a704-aa96-4788-825e-a343803ac76b\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 
15:52:07.714373 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/58a8a704-aa96-4788-825e-a343803ac76b-util\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2\" (UID: \"58a8a704-aa96-4788-825e-a343803ac76b\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.735579 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8bbf\" (UniqueName: \"kubernetes.io/projected/58a8a704-aa96-4788-825e-a343803ac76b-kube-api-access-j8bbf\") pod \"9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2\" (UID: \"58a8a704-aa96-4788-825e-a343803ac76b\") " pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.769805 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.834860 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" event={"ID":"45254a49-d34b-464a-97ec-0b04cbd7c1fe","Type":"ContainerStarted","Data":"3e744919ae69e6be185d29bff888a4984e538b08a7f6aae973cd5d86dbf0630d"} Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.964414 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:52:07 crc kubenswrapper[4704]: I1125 15:52:07.964468 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" 
podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.186538 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2"] Nov 25 15:52:08 crc kubenswrapper[4704]: W1125 15:52:08.197721 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58a8a704_aa96_4788_825e_a343803ac76b.slice/crio-01d9a7e832eb20a13ee2936a1d32c3b0d0f19053f2d26616f5acb85c8fd5e7c7 WatchSource:0}: Error finding container 01d9a7e832eb20a13ee2936a1d32c3b0d0f19053f2d26616f5acb85c8fd5e7c7: Status 404 returned error can't find the container with id 01d9a7e832eb20a13ee2936a1d32c3b0d0f19053f2d26616f5acb85c8fd5e7c7 Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.263844 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k"] Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.265314 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.283212 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k"] Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.421602 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89vhz\" (UniqueName: \"kubernetes.io/projected/68b8e4d2-c04c-470e-a4c8-debcf659c143-kube-api-access-89vhz\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k\" (UID: \"68b8e4d2-c04c-470e-a4c8-debcf659c143\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.422081 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68b8e4d2-c04c-470e-a4c8-debcf659c143-util\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k\" (UID: \"68b8e4d2-c04c-470e-a4c8-debcf659c143\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.422157 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68b8e4d2-c04c-470e-a4c8-debcf659c143-bundle\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k\" (UID: \"68b8e4d2-c04c-470e-a4c8-debcf659c143\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.523685 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/68b8e4d2-c04c-470e-a4c8-debcf659c143-bundle\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k\" (UID: \"68b8e4d2-c04c-470e-a4c8-debcf659c143\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.523823 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89vhz\" (UniqueName: \"kubernetes.io/projected/68b8e4d2-c04c-470e-a4c8-debcf659c143-kube-api-access-89vhz\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k\" (UID: \"68b8e4d2-c04c-470e-a4c8-debcf659c143\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.523857 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68b8e4d2-c04c-470e-a4c8-debcf659c143-util\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k\" (UID: \"68b8e4d2-c04c-470e-a4c8-debcf659c143\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.524678 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68b8e4d2-c04c-470e-a4c8-debcf659c143-util\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k\" (UID: \"68b8e4d2-c04c-470e-a4c8-debcf659c143\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.524952 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68b8e4d2-c04c-470e-a4c8-debcf659c143-bundle\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k\" (UID: 
\"68b8e4d2-c04c-470e-a4c8-debcf659c143\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.547330 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89vhz\" (UniqueName: \"kubernetes.io/projected/68b8e4d2-c04c-470e-a4c8-debcf659c143-kube-api-access-89vhz\") pod \"87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k\" (UID: \"68b8e4d2-c04c-470e-a4c8-debcf659c143\") " pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.605398 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.856277 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" event={"ID":"45254a49-d34b-464a-97ec-0b04cbd7c1fe","Type":"ContainerStarted","Data":"104bf47dc47849c1c6eb663b384e58a55c656c6e39e87b9a9fb75090a8991c2a"} Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.856354 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.859772 4704 generic.go:334] "Generic (PLEG): container finished" podID="58a8a704-aa96-4788-825e-a343803ac76b" containerID="ca00de47738f94ac7ad91b0394369476de776fc46be3d7c9012255a17977741d" exitCode=0 Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.859868 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" event={"ID":"58a8a704-aa96-4788-825e-a343803ac76b","Type":"ContainerDied","Data":"ca00de47738f94ac7ad91b0394369476de776fc46be3d7c9012255a17977741d"} Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 
15:52:08.859908 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" event={"ID":"58a8a704-aa96-4788-825e-a343803ac76b","Type":"ContainerStarted","Data":"01d9a7e832eb20a13ee2936a1d32c3b0d0f19053f2d26616f5acb85c8fd5e7c7"} Nov 25 15:52:08 crc kubenswrapper[4704]: I1125 15:52:08.875746 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" podStartSLOduration=2.875723642 podStartE2EDuration="2.875723642s" podCreationTimestamp="2025-11-25 15:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:52:08.874696842 +0000 UTC m=+1015.142970623" watchObservedRunningTime="2025-11-25 15:52:08.875723642 +0000 UTC m=+1015.143997433" Nov 25 15:52:09 crc kubenswrapper[4704]: I1125 15:52:09.064879 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k"] Nov 25 15:52:09 crc kubenswrapper[4704]: W1125 15:52:09.073298 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68b8e4d2_c04c_470e_a4c8_debcf659c143.slice/crio-722489f3376967e8b869f3c1bb59116b2a08209563446304a98d595b8d0faaf7 WatchSource:0}: Error finding container 722489f3376967e8b869f3c1bb59116b2a08209563446304a98d595b8d0faaf7: Status 404 returned error can't find the container with id 722489f3376967e8b869f3c1bb59116b2a08209563446304a98d595b8d0faaf7 Nov 25 15:52:09 crc kubenswrapper[4704]: I1125 15:52:09.867600 4704 generic.go:334] "Generic (PLEG): container finished" podID="58a8a704-aa96-4788-825e-a343803ac76b" containerID="82e9b8a2115262887e0c925a5f4f8d4d07a87de9cb834ec2bc513c516285a6f4" exitCode=0 Nov 25 15:52:09 crc kubenswrapper[4704]: I1125 15:52:09.867658 4704 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" event={"ID":"58a8a704-aa96-4788-825e-a343803ac76b","Type":"ContainerDied","Data":"82e9b8a2115262887e0c925a5f4f8d4d07a87de9cb834ec2bc513c516285a6f4"} Nov 25 15:52:09 crc kubenswrapper[4704]: I1125 15:52:09.869752 4704 generic.go:334] "Generic (PLEG): container finished" podID="68b8e4d2-c04c-470e-a4c8-debcf659c143" containerID="0988b617dd98e9f76bbeb2aca82646003fbc9b1b05e5e406f242f15a3eac3e37" exitCode=0 Nov 25 15:52:09 crc kubenswrapper[4704]: I1125 15:52:09.869806 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" event={"ID":"68b8e4d2-c04c-470e-a4c8-debcf659c143","Type":"ContainerDied","Data":"0988b617dd98e9f76bbeb2aca82646003fbc9b1b05e5e406f242f15a3eac3e37"} Nov 25 15:52:09 crc kubenswrapper[4704]: I1125 15:52:09.869865 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" event={"ID":"68b8e4d2-c04c-470e-a4c8-debcf659c143","Type":"ContainerStarted","Data":"722489f3376967e8b869f3c1bb59116b2a08209563446304a98d595b8d0faaf7"} Nov 25 15:52:10 crc kubenswrapper[4704]: I1125 15:52:10.884841 4704 generic.go:334] "Generic (PLEG): container finished" podID="58a8a704-aa96-4788-825e-a343803ac76b" containerID="fc1c25956f154cbb51f38349aed2401292087639d46ba20292e4a1d482d6e116" exitCode=0 Nov 25 15:52:10 crc kubenswrapper[4704]: I1125 15:52:10.885013 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" event={"ID":"58a8a704-aa96-4788-825e-a343803ac76b","Type":"ContainerDied","Data":"fc1c25956f154cbb51f38349aed2401292087639d46ba20292e4a1d482d6e116"} Nov 25 15:52:11 crc kubenswrapper[4704]: I1125 15:52:11.895325 4704 generic.go:334] "Generic (PLEG): container finished" podID="68b8e4d2-c04c-470e-a4c8-debcf659c143" 
containerID="faf221da1e7fbcd421ce0ff17d5f415eec9094d53e3ced25548ec417cdc6de15" exitCode=0 Nov 25 15:52:11 crc kubenswrapper[4704]: I1125 15:52:11.895445 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" event={"ID":"68b8e4d2-c04c-470e-a4c8-debcf659c143","Type":"ContainerDied","Data":"faf221da1e7fbcd421ce0ff17d5f415eec9094d53e3ced25548ec417cdc6de15"} Nov 25 15:52:12 crc kubenswrapper[4704]: I1125 15:52:12.207182 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" Nov 25 15:52:12 crc kubenswrapper[4704]: I1125 15:52:12.299417 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/58a8a704-aa96-4788-825e-a343803ac76b-util\") pod \"58a8a704-aa96-4788-825e-a343803ac76b\" (UID: \"58a8a704-aa96-4788-825e-a343803ac76b\") " Nov 25 15:52:12 crc kubenswrapper[4704]: I1125 15:52:12.299480 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/58a8a704-aa96-4788-825e-a343803ac76b-bundle\") pod \"58a8a704-aa96-4788-825e-a343803ac76b\" (UID: \"58a8a704-aa96-4788-825e-a343803ac76b\") " Nov 25 15:52:12 crc kubenswrapper[4704]: I1125 15:52:12.299534 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8bbf\" (UniqueName: \"kubernetes.io/projected/58a8a704-aa96-4788-825e-a343803ac76b-kube-api-access-j8bbf\") pod \"58a8a704-aa96-4788-825e-a343803ac76b\" (UID: \"58a8a704-aa96-4788-825e-a343803ac76b\") " Nov 25 15:52:12 crc kubenswrapper[4704]: I1125 15:52:12.300333 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58a8a704-aa96-4788-825e-a343803ac76b-bundle" (OuterVolumeSpecName: "bundle") pod 
"58a8a704-aa96-4788-825e-a343803ac76b" (UID: "58a8a704-aa96-4788-825e-a343803ac76b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:52:12 crc kubenswrapper[4704]: I1125 15:52:12.306130 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58a8a704-aa96-4788-825e-a343803ac76b-kube-api-access-j8bbf" (OuterVolumeSpecName: "kube-api-access-j8bbf") pod "58a8a704-aa96-4788-825e-a343803ac76b" (UID: "58a8a704-aa96-4788-825e-a343803ac76b"). InnerVolumeSpecName "kube-api-access-j8bbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:52:12 crc kubenswrapper[4704]: I1125 15:52:12.312574 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58a8a704-aa96-4788-825e-a343803ac76b-util" (OuterVolumeSpecName: "util") pod "58a8a704-aa96-4788-825e-a343803ac76b" (UID: "58a8a704-aa96-4788-825e-a343803ac76b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:52:12 crc kubenswrapper[4704]: I1125 15:52:12.400985 4704 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/58a8a704-aa96-4788-825e-a343803ac76b-util\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:12 crc kubenswrapper[4704]: I1125 15:52:12.401025 4704 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/58a8a704-aa96-4788-825e-a343803ac76b-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:12 crc kubenswrapper[4704]: I1125 15:52:12.401035 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8bbf\" (UniqueName: \"kubernetes.io/projected/58a8a704-aa96-4788-825e-a343803ac76b-kube-api-access-j8bbf\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:12 crc kubenswrapper[4704]: I1125 15:52:12.906041 4704 generic.go:334] "Generic (PLEG): container finished" podID="68b8e4d2-c04c-470e-a4c8-debcf659c143" 
containerID="930dac4554f680fd19c97490dc9788e2fe2fbeddf08ff1ef4b24c966632eca58" exitCode=0 Nov 25 15:52:12 crc kubenswrapper[4704]: I1125 15:52:12.906130 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" event={"ID":"68b8e4d2-c04c-470e-a4c8-debcf659c143","Type":"ContainerDied","Data":"930dac4554f680fd19c97490dc9788e2fe2fbeddf08ff1ef4b24c966632eca58"} Nov 25 15:52:12 crc kubenswrapper[4704]: I1125 15:52:12.909663 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" event={"ID":"58a8a704-aa96-4788-825e-a343803ac76b","Type":"ContainerDied","Data":"01d9a7e832eb20a13ee2936a1d32c3b0d0f19053f2d26616f5acb85c8fd5e7c7"} Nov 25 15:52:12 crc kubenswrapper[4704]: I1125 15:52:12.909699 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01d9a7e832eb20a13ee2936a1d32c3b0d0f19053f2d26616f5acb85c8fd5e7c7" Nov 25 15:52:12 crc kubenswrapper[4704]: I1125 15:52:12.909741 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2" Nov 25 15:52:15 crc kubenswrapper[4704]: I1125 15:52:15.136291 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" Nov 25 15:52:15 crc kubenswrapper[4704]: I1125 15:52:15.241433 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89vhz\" (UniqueName: \"kubernetes.io/projected/68b8e4d2-c04c-470e-a4c8-debcf659c143-kube-api-access-89vhz\") pod \"68b8e4d2-c04c-470e-a4c8-debcf659c143\" (UID: \"68b8e4d2-c04c-470e-a4c8-debcf659c143\") " Nov 25 15:52:15 crc kubenswrapper[4704]: I1125 15:52:15.241523 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68b8e4d2-c04c-470e-a4c8-debcf659c143-bundle\") pod \"68b8e4d2-c04c-470e-a4c8-debcf659c143\" (UID: \"68b8e4d2-c04c-470e-a4c8-debcf659c143\") " Nov 25 15:52:15 crc kubenswrapper[4704]: I1125 15:52:15.241559 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68b8e4d2-c04c-470e-a4c8-debcf659c143-util\") pod \"68b8e4d2-c04c-470e-a4c8-debcf659c143\" (UID: \"68b8e4d2-c04c-470e-a4c8-debcf659c143\") " Nov 25 15:52:15 crc kubenswrapper[4704]: I1125 15:52:15.243004 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68b8e4d2-c04c-470e-a4c8-debcf659c143-bundle" (OuterVolumeSpecName: "bundle") pod "68b8e4d2-c04c-470e-a4c8-debcf659c143" (UID: "68b8e4d2-c04c-470e-a4c8-debcf659c143"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:52:15 crc kubenswrapper[4704]: I1125 15:52:15.249352 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68b8e4d2-c04c-470e-a4c8-debcf659c143-kube-api-access-89vhz" (OuterVolumeSpecName: "kube-api-access-89vhz") pod "68b8e4d2-c04c-470e-a4c8-debcf659c143" (UID: "68b8e4d2-c04c-470e-a4c8-debcf659c143"). InnerVolumeSpecName "kube-api-access-89vhz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:52:15 crc kubenswrapper[4704]: I1125 15:52:15.309596 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68b8e4d2-c04c-470e-a4c8-debcf659c143-util" (OuterVolumeSpecName: "util") pod "68b8e4d2-c04c-470e-a4c8-debcf659c143" (UID: "68b8e4d2-c04c-470e-a4c8-debcf659c143"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:52:15 crc kubenswrapper[4704]: I1125 15:52:15.343479 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89vhz\" (UniqueName: \"kubernetes.io/projected/68b8e4d2-c04c-470e-a4c8-debcf659c143-kube-api-access-89vhz\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:15 crc kubenswrapper[4704]: I1125 15:52:15.343517 4704 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68b8e4d2-c04c-470e-a4c8-debcf659c143-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:15 crc kubenswrapper[4704]: I1125 15:52:15.343526 4704 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68b8e4d2-c04c-470e-a4c8-debcf659c143-util\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:15 crc kubenswrapper[4704]: I1125 15:52:15.943684 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" event={"ID":"68b8e4d2-c04c-470e-a4c8-debcf659c143","Type":"ContainerDied","Data":"722489f3376967e8b869f3c1bb59116b2a08209563446304a98d595b8d0faaf7"} Nov 25 15:52:15 crc kubenswrapper[4704]: I1125 15:52:15.944199 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="722489f3376967e8b869f3c1bb59116b2a08209563446304a98d595b8d0faaf7" Nov 25 15:52:15 crc kubenswrapper[4704]: I1125 15:52:15.943743 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k" Nov 25 15:52:25 crc kubenswrapper[4704]: I1125 15:52:25.897453 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z"] Nov 25 15:52:25 crc kubenswrapper[4704]: E1125 15:52:25.898758 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b8e4d2-c04c-470e-a4c8-debcf659c143" containerName="extract" Nov 25 15:52:25 crc kubenswrapper[4704]: I1125 15:52:25.898777 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b8e4d2-c04c-470e-a4c8-debcf659c143" containerName="extract" Nov 25 15:52:25 crc kubenswrapper[4704]: E1125 15:52:25.898813 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b8e4d2-c04c-470e-a4c8-debcf659c143" containerName="util" Nov 25 15:52:25 crc kubenswrapper[4704]: I1125 15:52:25.898822 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b8e4d2-c04c-470e-a4c8-debcf659c143" containerName="util" Nov 25 15:52:25 crc kubenswrapper[4704]: E1125 15:52:25.898837 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58a8a704-aa96-4788-825e-a343803ac76b" containerName="extract" Nov 25 15:52:25 crc kubenswrapper[4704]: I1125 15:52:25.898846 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a8a704-aa96-4788-825e-a343803ac76b" containerName="extract" Nov 25 15:52:25 crc kubenswrapper[4704]: E1125 15:52:25.898867 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58a8a704-aa96-4788-825e-a343803ac76b" containerName="util" Nov 25 15:52:25 crc kubenswrapper[4704]: I1125 15:52:25.898874 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a8a704-aa96-4788-825e-a343803ac76b" containerName="util" Nov 25 15:52:25 crc kubenswrapper[4704]: E1125 15:52:25.898892 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b8e4d2-c04c-470e-a4c8-debcf659c143" 
containerName="pull" Nov 25 15:52:25 crc kubenswrapper[4704]: I1125 15:52:25.898900 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b8e4d2-c04c-470e-a4c8-debcf659c143" containerName="pull" Nov 25 15:52:25 crc kubenswrapper[4704]: E1125 15:52:25.898910 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58a8a704-aa96-4788-825e-a343803ac76b" containerName="pull" Nov 25 15:52:25 crc kubenswrapper[4704]: I1125 15:52:25.898920 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a8a704-aa96-4788-825e-a343803ac76b" containerName="pull" Nov 25 15:52:25 crc kubenswrapper[4704]: I1125 15:52:25.899057 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b8e4d2-c04c-470e-a4c8-debcf659c143" containerName="extract" Nov 25 15:52:25 crc kubenswrapper[4704]: I1125 15:52:25.899080 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="58a8a704-aa96-4788-825e-a343803ac76b" containerName="extract" Nov 25 15:52:25 crc kubenswrapper[4704]: I1125 15:52:25.899673 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z" Nov 25 15:52:25 crc kubenswrapper[4704]: I1125 15:52:25.902779 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-pzv65" Nov 25 15:52:25 crc kubenswrapper[4704]: I1125 15:52:25.903854 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert" Nov 25 15:52:25 crc kubenswrapper[4704]: I1125 15:52:25.910473 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z"] Nov 25 15:52:25 crc kubenswrapper[4704]: I1125 15:52:25.982912 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3af84a01-fe4f-44f8-88a2-e68ae8855933-webhook-cert\") pod \"swift-operator-controller-manager-6d895d4c49-gsd7z\" (UID: \"3af84a01-fe4f-44f8-88a2-e68ae8855933\") " pod="openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z" Nov 25 15:52:25 crc kubenswrapper[4704]: I1125 15:52:25.983002 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh2jb\" (UniqueName: \"kubernetes.io/projected/3af84a01-fe4f-44f8-88a2-e68ae8855933-kube-api-access-gh2jb\") pod \"swift-operator-controller-manager-6d895d4c49-gsd7z\" (UID: \"3af84a01-fe4f-44f8-88a2-e68ae8855933\") " pod="openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z" Nov 25 15:52:25 crc kubenswrapper[4704]: I1125 15:52:25.983262 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3af84a01-fe4f-44f8-88a2-e68ae8855933-apiservice-cert\") pod \"swift-operator-controller-manager-6d895d4c49-gsd7z\" (UID: 
\"3af84a01-fe4f-44f8-88a2-e68ae8855933\") " pod="openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z" Nov 25 15:52:26 crc kubenswrapper[4704]: I1125 15:52:26.084456 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3af84a01-fe4f-44f8-88a2-e68ae8855933-webhook-cert\") pod \"swift-operator-controller-manager-6d895d4c49-gsd7z\" (UID: \"3af84a01-fe4f-44f8-88a2-e68ae8855933\") " pod="openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z" Nov 25 15:52:26 crc kubenswrapper[4704]: I1125 15:52:26.084542 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh2jb\" (UniqueName: \"kubernetes.io/projected/3af84a01-fe4f-44f8-88a2-e68ae8855933-kube-api-access-gh2jb\") pod \"swift-operator-controller-manager-6d895d4c49-gsd7z\" (UID: \"3af84a01-fe4f-44f8-88a2-e68ae8855933\") " pod="openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z" Nov 25 15:52:26 crc kubenswrapper[4704]: I1125 15:52:26.084606 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3af84a01-fe4f-44f8-88a2-e68ae8855933-apiservice-cert\") pod \"swift-operator-controller-manager-6d895d4c49-gsd7z\" (UID: \"3af84a01-fe4f-44f8-88a2-e68ae8855933\") " pod="openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z" Nov 25 15:52:26 crc kubenswrapper[4704]: I1125 15:52:26.093762 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3af84a01-fe4f-44f8-88a2-e68ae8855933-apiservice-cert\") pod \"swift-operator-controller-manager-6d895d4c49-gsd7z\" (UID: \"3af84a01-fe4f-44f8-88a2-e68ae8855933\") " pod="openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z" Nov 25 15:52:26 crc kubenswrapper[4704]: I1125 15:52:26.105296 4704 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3af84a01-fe4f-44f8-88a2-e68ae8855933-webhook-cert\") pod \"swift-operator-controller-manager-6d895d4c49-gsd7z\" (UID: \"3af84a01-fe4f-44f8-88a2-e68ae8855933\") " pod="openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z" Nov 25 15:52:26 crc kubenswrapper[4704]: I1125 15:52:26.106718 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh2jb\" (UniqueName: \"kubernetes.io/projected/3af84a01-fe4f-44f8-88a2-e68ae8855933-kube-api-access-gh2jb\") pod \"swift-operator-controller-manager-6d895d4c49-gsd7z\" (UID: \"3af84a01-fe4f-44f8-88a2-e68ae8855933\") " pod="openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z" Nov 25 15:52:26 crc kubenswrapper[4704]: I1125 15:52:26.228505 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z" Nov 25 15:52:26 crc kubenswrapper[4704]: I1125 15:52:26.658166 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z"] Nov 25 15:52:26 crc kubenswrapper[4704]: W1125 15:52:26.668679 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3af84a01_fe4f_44f8_88a2_e68ae8855933.slice/crio-0b6d61db40abc907acd39f0cffa4dcf1663c536d83880d828e504c35cc473939 WatchSource:0}: Error finding container 0b6d61db40abc907acd39f0cffa4dcf1663c536d83880d828e504c35cc473939: Status 404 returned error can't find the container with id 0b6d61db40abc907acd39f0cffa4dcf1663c536d83880d828e504c35cc473939 Nov 25 15:52:26 crc kubenswrapper[4704]: I1125 15:52:26.673291 4704 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 15:52:27 crc kubenswrapper[4704]: I1125 15:52:27.019346 4704 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z" event={"ID":"3af84a01-fe4f-44f8-88a2-e68ae8855933","Type":"ContainerStarted","Data":"0b6d61db40abc907acd39f0cffa4dcf1663c536d83880d828e504c35cc473939"} Nov 25 15:52:29 crc kubenswrapper[4704]: I1125 15:52:29.037505 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z" event={"ID":"3af84a01-fe4f-44f8-88a2-e68ae8855933","Type":"ContainerStarted","Data":"85051d227bc0ec1ded5667f39fcf5e803549abf7ca9befe4648fdd392a48a46d"} Nov 25 15:52:29 crc kubenswrapper[4704]: I1125 15:52:29.038073 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z" Nov 25 15:52:29 crc kubenswrapper[4704]: I1125 15:52:29.063882 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z" podStartSLOduration=2.024822846 podStartE2EDuration="4.063862012s" podCreationTimestamp="2025-11-25 15:52:25 +0000 UTC" firstStartedPulling="2025-11-25 15:52:26.673056744 +0000 UTC m=+1032.941330515" lastFinishedPulling="2025-11-25 15:52:28.7120959 +0000 UTC m=+1034.980369681" observedRunningTime="2025-11-25 15:52:29.057385774 +0000 UTC m=+1035.325659545" watchObservedRunningTime="2025-11-25 15:52:29.063862012 +0000 UTC m=+1035.332135793" Nov 25 15:52:31 crc kubenswrapper[4704]: I1125 15:52:31.694324 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t"] Nov 25 15:52:31 crc kubenswrapper[4704]: I1125 15:52:31.696236 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t" Nov 25 15:52:31 crc kubenswrapper[4704]: I1125 15:52:31.700068 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert" Nov 25 15:52:31 crc kubenswrapper[4704]: I1125 15:52:31.700358 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-dmzc2" Nov 25 15:52:31 crc kubenswrapper[4704]: I1125 15:52:31.739398 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t"] Nov 25 15:52:31 crc kubenswrapper[4704]: I1125 15:52:31.874942 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/13eebb33-6991-4747-883d-c13102f4a922-webhook-cert\") pod \"horizon-operator-controller-manager-5d4c77ff9c-lgm4t\" (UID: \"13eebb33-6991-4747-883d-c13102f4a922\") " pod="openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t" Nov 25 15:52:31 crc kubenswrapper[4704]: I1125 15:52:31.875023 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/13eebb33-6991-4747-883d-c13102f4a922-apiservice-cert\") pod \"horizon-operator-controller-manager-5d4c77ff9c-lgm4t\" (UID: \"13eebb33-6991-4747-883d-c13102f4a922\") " pod="openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t" Nov 25 15:52:31 crc kubenswrapper[4704]: I1125 15:52:31.875070 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bzjp\" (UniqueName: \"kubernetes.io/projected/13eebb33-6991-4747-883d-c13102f4a922-kube-api-access-7bzjp\") pod \"horizon-operator-controller-manager-5d4c77ff9c-lgm4t\" (UID: 
\"13eebb33-6991-4747-883d-c13102f4a922\") " pod="openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t" Nov 25 15:52:31 crc kubenswrapper[4704]: I1125 15:52:31.976999 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/13eebb33-6991-4747-883d-c13102f4a922-webhook-cert\") pod \"horizon-operator-controller-manager-5d4c77ff9c-lgm4t\" (UID: \"13eebb33-6991-4747-883d-c13102f4a922\") " pod="openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t" Nov 25 15:52:31 crc kubenswrapper[4704]: I1125 15:52:31.977060 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/13eebb33-6991-4747-883d-c13102f4a922-apiservice-cert\") pod \"horizon-operator-controller-manager-5d4c77ff9c-lgm4t\" (UID: \"13eebb33-6991-4747-883d-c13102f4a922\") " pod="openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t" Nov 25 15:52:31 crc kubenswrapper[4704]: I1125 15:52:31.977109 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bzjp\" (UniqueName: \"kubernetes.io/projected/13eebb33-6991-4747-883d-c13102f4a922-kube-api-access-7bzjp\") pod \"horizon-operator-controller-manager-5d4c77ff9c-lgm4t\" (UID: \"13eebb33-6991-4747-883d-c13102f4a922\") " pod="openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t" Nov 25 15:52:31 crc kubenswrapper[4704]: I1125 15:52:31.985758 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/13eebb33-6991-4747-883d-c13102f4a922-apiservice-cert\") pod \"horizon-operator-controller-manager-5d4c77ff9c-lgm4t\" (UID: \"13eebb33-6991-4747-883d-c13102f4a922\") " pod="openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t" Nov 25 15:52:31 crc kubenswrapper[4704]: I1125 15:52:31.985853 4704 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/13eebb33-6991-4747-883d-c13102f4a922-webhook-cert\") pod \"horizon-operator-controller-manager-5d4c77ff9c-lgm4t\" (UID: \"13eebb33-6991-4747-883d-c13102f4a922\") " pod="openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t" Nov 25 15:52:31 crc kubenswrapper[4704]: I1125 15:52:31.996028 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bzjp\" (UniqueName: \"kubernetes.io/projected/13eebb33-6991-4747-883d-c13102f4a922-kube-api-access-7bzjp\") pod \"horizon-operator-controller-manager-5d4c77ff9c-lgm4t\" (UID: \"13eebb33-6991-4747-883d-c13102f4a922\") " pod="openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t" Nov 25 15:52:32 crc kubenswrapper[4704]: I1125 15:52:32.016820 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t" Nov 25 15:52:32 crc kubenswrapper[4704]: I1125 15:52:32.300918 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t"] Nov 25 15:52:33 crc kubenswrapper[4704]: I1125 15:52:33.084151 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t" event={"ID":"13eebb33-6991-4747-883d-c13102f4a922","Type":"ContainerStarted","Data":"39f6f5a0ea6d0e03ed2b5a46b1650620200f85ff1548c397fa4a319c71a3c3d9"} Nov 25 15:52:35 crc kubenswrapper[4704]: I1125 15:52:35.099990 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t" event={"ID":"13eebb33-6991-4747-883d-c13102f4a922","Type":"ContainerStarted","Data":"89af23900944103e39f42a12a99bac73b15108f079ace93a7722d58ed6a188f2"} Nov 25 15:52:35 crc kubenswrapper[4704]: I1125 15:52:35.100531 4704 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t" Nov 25 15:52:35 crc kubenswrapper[4704]: I1125 15:52:35.117714 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t" podStartSLOduration=1.855984287 podStartE2EDuration="4.117696159s" podCreationTimestamp="2025-11-25 15:52:31 +0000 UTC" firstStartedPulling="2025-11-25 15:52:32.317444711 +0000 UTC m=+1038.585718492" lastFinishedPulling="2025-11-25 15:52:34.579156583 +0000 UTC m=+1040.847430364" observedRunningTime="2025-11-25 15:52:35.117418121 +0000 UTC m=+1041.385691912" watchObservedRunningTime="2025-11-25 15:52:35.117696159 +0000 UTC m=+1041.385969940" Nov 25 15:52:36 crc kubenswrapper[4704]: I1125 15:52:36.233287 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6d895d4c49-gsd7z" Nov 25 15:52:37 crc kubenswrapper[4704]: I1125 15:52:37.964728 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:52:37 crc kubenswrapper[4704]: I1125 15:52:37.965375 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:52:39 crc kubenswrapper[4704]: I1125 15:52:39.189879 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/keystone-fc478b69-fc2jj" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 
15:52:41.108241 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.113538 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.116354 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-files" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.116434 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-lnpbl" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.116620 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-storage-config-data" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.117277 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-conf" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.141590 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.200150 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.200199 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ace2c1c5-31ae-43db-891a-6a587176c215-lock\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.200253 4704 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlbpw\" (UniqueName: \"kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-kube-api-access-tlbpw\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.200292 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.200346 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ace2c1c5-31ae-43db-891a-6a587176c215-cache\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.301664 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.301747 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ace2c1c5-31ae-43db-891a-6a587176c215-cache\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.301771 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.301805 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ace2c1c5-31ae-43db-891a-6a587176c215-lock\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.301848 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlbpw\" (UniqueName: \"kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-kube-api-access-tlbpw\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: E1125 15:52:41.301894 4704 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 25 15:52:41 crc kubenswrapper[4704]: E1125 15:52:41.301936 4704 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Nov 25 15:52:41 crc kubenswrapper[4704]: E1125 15:52:41.302003 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift podName:ace2c1c5-31ae-43db-891a-6a587176c215 nodeName:}" failed. No retries permitted until 2025-11-25 15:52:41.801979392 +0000 UTC m=+1048.070253353 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift") pod "swift-storage-0" (UID: "ace2c1c5-31ae-43db-891a-6a587176c215") : configmap "swift-ring-files" not found Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.302242 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ace2c1c5-31ae-43db-891a-6a587176c215-cache\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.302385 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.302465 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ace2c1c5-31ae-43db-891a-6a587176c215-lock\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.322225 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.334677 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlbpw\" (UniqueName: \"kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-kube-api-access-tlbpw\") pod \"swift-storage-0\" (UID: 
\"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.669057 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-2tq6c"] Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.670416 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.673396 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-proxy-config-data" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.673714 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-config-data" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.674042 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-scripts" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.688513 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-2tq6c"] Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.709129 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-2tq6c"] Nov 25 15:52:41 crc kubenswrapper[4704]: E1125 15:52:41.739821 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[dispersionconf etc-swift kube-api-access-c7v7l ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[dispersionconf etc-swift kube-api-access-c7v7l ring-data-devices scripts swiftconf]: context canceled" pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" podUID="0a24ed77-9712-40ce-958a-d8aa03c0ba2f" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.780063 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-m7p8n"] Nov 25 15:52:41 crc 
kubenswrapper[4704]: I1125 15:52:41.781167 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.790901 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-m7p8n"] Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.834110 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-swiftconf\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.834192 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-etc-swift\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.834345 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.834459 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-dispersionconf\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.834496 4704 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-ring-data-devices\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: E1125 15:52:41.834541 4704 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.834554 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-scripts\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: E1125 15:52:41.834561 4704 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Nov 25 15:52:41 crc kubenswrapper[4704]: E1125 15:52:41.834684 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift podName:ace2c1c5-31ae-43db-891a-6a587176c215 nodeName:}" failed. No retries permitted until 2025-11-25 15:52:42.8346514 +0000 UTC m=+1049.102925181 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift") pod "swift-storage-0" (UID: "ace2c1c5-31ae-43db-891a-6a587176c215") : configmap "swift-ring-files" not found Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.834705 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7v7l\" (UniqueName: \"kubernetes.io/projected/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-kube-api-access-c7v7l\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.935908 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-swiftconf\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.935968 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f25398c6-78b6-4b82-b3bf-ac1037e47998-scripts\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.936004 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f25398c6-78b6-4b82-b3bf-ac1037e47998-ring-data-devices\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.936045 4704 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-etc-swift\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.936079 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7csf\" (UniqueName: \"kubernetes.io/projected/f25398c6-78b6-4b82-b3bf-ac1037e47998-kube-api-access-w7csf\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.936107 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f25398c6-78b6-4b82-b3bf-ac1037e47998-etc-swift\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.936148 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f25398c6-78b6-4b82-b3bf-ac1037e47998-swiftconf\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.936193 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-dispersionconf\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.936336 4704 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-ring-data-devices\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.936422 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f25398c6-78b6-4b82-b3bf-ac1037e47998-dispersionconf\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.936491 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-scripts\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.936533 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-etc-swift\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.936537 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7v7l\" (UniqueName: \"kubernetes.io/projected/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-kube-api-access-c7v7l\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.937110 4704 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-ring-data-devices\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.937198 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-scripts\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.941271 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-dispersionconf\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.941591 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-swiftconf\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:41 crc kubenswrapper[4704]: I1125 15:52:41.963429 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7v7l\" (UniqueName: \"kubernetes.io/projected/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-kube-api-access-c7v7l\") pod \"swift-ring-rebalance-2tq6c\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.022198 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/horizon-operator-controller-manager-5d4c77ff9c-lgm4t" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.038354 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f25398c6-78b6-4b82-b3bf-ac1037e47998-scripts\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.038410 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f25398c6-78b6-4b82-b3bf-ac1037e47998-ring-data-devices\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.038456 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7csf\" (UniqueName: \"kubernetes.io/projected/f25398c6-78b6-4b82-b3bf-ac1037e47998-kube-api-access-w7csf\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.038491 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f25398c6-78b6-4b82-b3bf-ac1037e47998-etc-swift\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.038529 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f25398c6-78b6-4b82-b3bf-ac1037e47998-swiftconf\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " 
pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.038579 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f25398c6-78b6-4b82-b3bf-ac1037e47998-dispersionconf\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.040004 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f25398c6-78b6-4b82-b3bf-ac1037e47998-ring-data-devices\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.040241 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f25398c6-78b6-4b82-b3bf-ac1037e47998-etc-swift\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.040329 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f25398c6-78b6-4b82-b3bf-ac1037e47998-scripts\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.047418 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f25398c6-78b6-4b82-b3bf-ac1037e47998-dispersionconf\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:42 crc 
kubenswrapper[4704]: I1125 15:52:42.052246 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f25398c6-78b6-4b82-b3bf-ac1037e47998-swiftconf\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.057215 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7csf\" (UniqueName: \"kubernetes.io/projected/f25398c6-78b6-4b82-b3bf-ac1037e47998-kube-api-access-w7csf\") pod \"swift-ring-rebalance-m7p8n\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.105089 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.165666 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.176191 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.344397 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-etc-swift\") pod \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.344516 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-dispersionconf\") pod \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.344596 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7v7l\" (UniqueName: \"kubernetes.io/projected/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-kube-api-access-c7v7l\") pod \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.344667 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-scripts\") pod \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.344689 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-swiftconf\") pod \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.344761 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-ring-data-devices\") pod \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\" (UID: \"0a24ed77-9712-40ce-958a-d8aa03c0ba2f\") " Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.346002 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0a24ed77-9712-40ce-958a-d8aa03c0ba2f" (UID: "0a24ed77-9712-40ce-958a-d8aa03c0ba2f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.346274 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0a24ed77-9712-40ce-958a-d8aa03c0ba2f" (UID: "0a24ed77-9712-40ce-958a-d8aa03c0ba2f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.346922 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-scripts" (OuterVolumeSpecName: "scripts") pod "0a24ed77-9712-40ce-958a-d8aa03c0ba2f" (UID: "0a24ed77-9712-40ce-958a-d8aa03c0ba2f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.354606 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-kube-api-access-c7v7l" (OuterVolumeSpecName: "kube-api-access-c7v7l") pod "0a24ed77-9712-40ce-958a-d8aa03c0ba2f" (UID: "0a24ed77-9712-40ce-958a-d8aa03c0ba2f"). InnerVolumeSpecName "kube-api-access-c7v7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.356530 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0a24ed77-9712-40ce-958a-d8aa03c0ba2f" (UID: "0a24ed77-9712-40ce-958a-d8aa03c0ba2f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.361565 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0a24ed77-9712-40ce-958a-d8aa03c0ba2f" (UID: "0a24ed77-9712-40ce-958a-d8aa03c0ba2f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.446042 4704 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.446086 4704 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.446099 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7v7l\" (UniqueName: \"kubernetes.io/projected/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-kube-api-access-c7v7l\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.446108 4704 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:42 crc 
kubenswrapper[4704]: I1125 15:52:42.446116 4704 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.446124 4704 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0a24ed77-9712-40ce-958a-d8aa03c0ba2f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.548799 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-m7p8n"] Nov 25 15:52:42 crc kubenswrapper[4704]: W1125 15:52:42.560104 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf25398c6_78b6_4b82_b3bf_ac1037e47998.slice/crio-a3235188b75ea97f4d88ae4a98d2d37cc4938ef6666f4722440d14110c8fd18d WatchSource:0}: Error finding container a3235188b75ea97f4d88ae4a98d2d37cc4938ef6666f4722440d14110c8fd18d: Status 404 returned error can't find the container with id a3235188b75ea97f4d88ae4a98d2d37cc4938ef6666f4722440d14110c8fd18d Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.801875 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-ph68v"] Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.802687 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-ph68v" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.804899 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-index-dockercfg-4z8g5" Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.813462 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-ph68v"] Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.852636 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:42 crc kubenswrapper[4704]: E1125 15:52:42.852986 4704 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 25 15:52:42 crc kubenswrapper[4704]: E1125 15:52:42.853030 4704 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Nov 25 15:52:42 crc kubenswrapper[4704]: E1125 15:52:42.853104 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift podName:ace2c1c5-31ae-43db-891a-6a587176c215 nodeName:}" failed. No retries permitted until 2025-11-25 15:52:44.853078285 +0000 UTC m=+1051.121352066 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift") pod "swift-storage-0" (UID: "ace2c1c5-31ae-43db-891a-6a587176c215") : configmap "swift-ring-files" not found Nov 25 15:52:42 crc kubenswrapper[4704]: I1125 15:52:42.954578 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz2c2\" (UniqueName: \"kubernetes.io/projected/d2af75d4-d76a-48ea-baa4-0ce23b299e48-kube-api-access-bz2c2\") pod \"glance-operator-index-ph68v\" (UID: \"d2af75d4-d76a-48ea-baa4-0ce23b299e48\") " pod="openstack-operators/glance-operator-index-ph68v" Nov 25 15:52:43 crc kubenswrapper[4704]: I1125 15:52:43.056449 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz2c2\" (UniqueName: \"kubernetes.io/projected/d2af75d4-d76a-48ea-baa4-0ce23b299e48-kube-api-access-bz2c2\") pod \"glance-operator-index-ph68v\" (UID: \"d2af75d4-d76a-48ea-baa4-0ce23b299e48\") " pod="openstack-operators/glance-operator-index-ph68v" Nov 25 15:52:43 crc kubenswrapper[4704]: I1125 15:52:43.080010 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz2c2\" (UniqueName: \"kubernetes.io/projected/d2af75d4-d76a-48ea-baa4-0ce23b299e48-kube-api-access-bz2c2\") pod \"glance-operator-index-ph68v\" (UID: \"d2af75d4-d76a-48ea-baa4-0ce23b299e48\") " pod="openstack-operators/glance-operator-index-ph68v" Nov 25 15:52:43 crc kubenswrapper[4704]: I1125 15:52:43.120113 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-ph68v" Nov 25 15:52:43 crc kubenswrapper[4704]: I1125 15:52:43.203939 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" event={"ID":"f25398c6-78b6-4b82-b3bf-ac1037e47998","Type":"ContainerStarted","Data":"a3235188b75ea97f4d88ae4a98d2d37cc4938ef6666f4722440d14110c8fd18d"} Nov 25 15:52:43 crc kubenswrapper[4704]: I1125 15:52:43.203993 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-2tq6c" Nov 25 15:52:43 crc kubenswrapper[4704]: I1125 15:52:43.273489 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-2tq6c"] Nov 25 15:52:43 crc kubenswrapper[4704]: I1125 15:52:43.286147 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-2tq6c"] Nov 25 15:52:43 crc kubenswrapper[4704]: I1125 15:52:43.625738 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-ph68v"] Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.213451 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-ph68v" event={"ID":"d2af75d4-d76a-48ea-baa4-0ce23b299e48","Type":"ContainerStarted","Data":"9ab9a0de20f9f6e5ecefb24217bb5f1cee801712ca293c165ad44f38a85190f6"} Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.426067 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a24ed77-9712-40ce-958a-d8aa03c0ba2f" path="/var/lib/kubelet/pods/0a24ed77-9712-40ce-958a-d8aa03c0ba2f/volumes" Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.426427 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch"] Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.427633 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.440005 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch"] Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.580246 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.580317 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06d09490-29de-42d3-a33f-067f3c9ba573-run-httpd\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.580357 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j55x6\" (UniqueName: \"kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-kube-api-access-j55x6\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.580567 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d09490-29de-42d3-a33f-067f3c9ba573-config-data\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.580659 4704 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06d09490-29de-42d3-a33f-067f3c9ba573-log-httpd\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.681710 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j55x6\" (UniqueName: \"kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-kube-api-access-j55x6\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.681860 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d09490-29de-42d3-a33f-067f3c9ba573-config-data\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.681895 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06d09490-29de-42d3-a33f-067f3c9ba573-log-httpd\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.681968 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.681996 4704 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06d09490-29de-42d3-a33f-067f3c9ba573-run-httpd\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:44 crc kubenswrapper[4704]: E1125 15:52:44.682138 4704 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 25 15:52:44 crc kubenswrapper[4704]: E1125 15:52:44.682162 4704 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch: configmap "swift-ring-files" not found Nov 25 15:52:44 crc kubenswrapper[4704]: E1125 15:52:44.682216 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift podName:06d09490-29de-42d3-a33f-067f3c9ba573 nodeName:}" failed. No retries permitted until 2025-11-25 15:52:45.182199437 +0000 UTC m=+1051.450473218 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift") pod "swift-proxy-6bd58cfcf7-ctjch" (UID: "06d09490-29de-42d3-a33f-067f3c9ba573") : configmap "swift-ring-files" not found Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.682689 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06d09490-29de-42d3-a33f-067f3c9ba573-run-httpd\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.682721 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06d09490-29de-42d3-a33f-067f3c9ba573-log-httpd\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.701224 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d09490-29de-42d3-a33f-067f3c9ba573-config-data\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.707675 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j55x6\" (UniqueName: \"kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-kube-api-access-j55x6\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:44 crc kubenswrapper[4704]: I1125 15:52:44.893766 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:44 crc kubenswrapper[4704]: E1125 15:52:44.894015 4704 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 25 15:52:44 crc kubenswrapper[4704]: E1125 15:52:44.894447 4704 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Nov 25 15:52:44 crc kubenswrapper[4704]: E1125 15:52:44.894518 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift podName:ace2c1c5-31ae-43db-891a-6a587176c215 nodeName:}" failed. No retries permitted until 2025-11-25 15:52:48.894496771 +0000 UTC m=+1055.162770552 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift") pod "swift-storage-0" (UID: "ace2c1c5-31ae-43db-891a-6a587176c215") : configmap "swift-ring-files" not found Nov 25 15:52:45 crc kubenswrapper[4704]: I1125 15:52:45.198610 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:45 crc kubenswrapper[4704]: E1125 15:52:45.198828 4704 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 25 15:52:45 crc kubenswrapper[4704]: E1125 15:52:45.198848 4704 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch: configmap 
"swift-ring-files" not found Nov 25 15:52:45 crc kubenswrapper[4704]: E1125 15:52:45.198896 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift podName:06d09490-29de-42d3-a33f-067f3c9ba573 nodeName:}" failed. No retries permitted until 2025-11-25 15:52:46.198880621 +0000 UTC m=+1052.467154402 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift") pod "swift-proxy-6bd58cfcf7-ctjch" (UID: "06d09490-29de-42d3-a33f-067f3c9ba573") : configmap "swift-ring-files" not found Nov 25 15:52:46 crc kubenswrapper[4704]: I1125 15:52:46.238595 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:46 crc kubenswrapper[4704]: E1125 15:52:46.238742 4704 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 25 15:52:46 crc kubenswrapper[4704]: E1125 15:52:46.238812 4704 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch: configmap "swift-ring-files" not found Nov 25 15:52:46 crc kubenswrapper[4704]: E1125 15:52:46.238862 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift podName:06d09490-29de-42d3-a33f-067f3c9ba573 nodeName:}" failed. No retries permitted until 2025-11-25 15:52:48.23884501 +0000 UTC m=+1054.507118791 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift") pod "swift-proxy-6bd58cfcf7-ctjch" (UID: "06d09490-29de-42d3-a33f-067f3c9ba573") : configmap "swift-ring-files" not found Nov 25 15:52:48 crc kubenswrapper[4704]: I1125 15:52:48.266764 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:48 crc kubenswrapper[4704]: E1125 15:52:48.266938 4704 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 25 15:52:48 crc kubenswrapper[4704]: E1125 15:52:48.267151 4704 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch: configmap "swift-ring-files" not found Nov 25 15:52:48 crc kubenswrapper[4704]: E1125 15:52:48.267194 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift podName:06d09490-29de-42d3-a33f-067f3c9ba573 nodeName:}" failed. No retries permitted until 2025-11-25 15:52:52.267179958 +0000 UTC m=+1058.535453739 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift") pod "swift-proxy-6bd58cfcf7-ctjch" (UID: "06d09490-29de-42d3-a33f-067f3c9ba573") : configmap "swift-ring-files" not found Nov 25 15:52:48 crc kubenswrapper[4704]: I1125 15:52:48.976511 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:48 crc kubenswrapper[4704]: E1125 15:52:48.976712 4704 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 25 15:52:48 crc kubenswrapper[4704]: E1125 15:52:48.976731 4704 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Nov 25 15:52:48 crc kubenswrapper[4704]: E1125 15:52:48.976805 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift podName:ace2c1c5-31ae-43db-891a-6a587176c215 nodeName:}" failed. No retries permitted until 2025-11-25 15:52:56.976772426 +0000 UTC m=+1063.245046207 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift") pod "swift-storage-0" (UID: "ace2c1c5-31ae-43db-891a-6a587176c215") : configmap "swift-ring-files" not found Nov 25 15:52:50 crc kubenswrapper[4704]: I1125 15:52:50.258883 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" event={"ID":"f25398c6-78b6-4b82-b3bf-ac1037e47998","Type":"ContainerStarted","Data":"fc8856d6dd3401c5f95b4cfa50c296f7a17b70c1d89c2a27b72c2e3651d39677"} Nov 25 15:52:50 crc kubenswrapper[4704]: I1125 15:52:50.281523 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" podStartSLOduration=2.582646401 podStartE2EDuration="9.281507158s" podCreationTimestamp="2025-11-25 15:52:41 +0000 UTC" firstStartedPulling="2025-11-25 15:52:42.562296379 +0000 UTC m=+1048.830570160" lastFinishedPulling="2025-11-25 15:52:49.261157136 +0000 UTC m=+1055.529430917" observedRunningTime="2025-11-25 15:52:50.275549936 +0000 UTC m=+1056.543823717" watchObservedRunningTime="2025-11-25 15:52:50.281507158 +0000 UTC m=+1056.549780939" Nov 25 15:52:52 crc kubenswrapper[4704]: I1125 15:52:52.337198 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:52:52 crc kubenswrapper[4704]: E1125 15:52:52.337430 4704 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Nov 25 15:52:52 crc kubenswrapper[4704]: E1125 15:52:52.337765 4704 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch: configmap 
"swift-ring-files" not found Nov 25 15:52:52 crc kubenswrapper[4704]: E1125 15:52:52.337852 4704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift podName:06d09490-29de-42d3-a33f-067f3c9ba573 nodeName:}" failed. No retries permitted until 2025-11-25 15:53:00.337835536 +0000 UTC m=+1066.606109317 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift") pod "swift-proxy-6bd58cfcf7-ctjch" (UID: "06d09490-29de-42d3-a33f-067f3c9ba573") : configmap "swift-ring-files" not found Nov 25 15:52:53 crc kubenswrapper[4704]: I1125 15:52:53.284368 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-ph68v" event={"ID":"d2af75d4-d76a-48ea-baa4-0ce23b299e48","Type":"ContainerStarted","Data":"bd0be7ed597f5f5f35f6b6034b7a5ce8ef344d178d96818ded60093cc677cc56"} Nov 25 15:52:53 crc kubenswrapper[4704]: I1125 15:52:53.303275 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-ph68v" podStartSLOduration=2.621820548 podStartE2EDuration="11.303242967s" podCreationTimestamp="2025-11-25 15:52:42 +0000 UTC" firstStartedPulling="2025-11-25 15:52:43.645060428 +0000 UTC m=+1049.913334209" lastFinishedPulling="2025-11-25 15:52:52.326482837 +0000 UTC m=+1058.594756628" observedRunningTime="2025-11-25 15:52:53.295357049 +0000 UTC m=+1059.563630840" watchObservedRunningTime="2025-11-25 15:52:53.303242967 +0000 UTC m=+1059.571516768" Nov 25 15:52:57 crc kubenswrapper[4704]: I1125 15:52:57.025893 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 
15:52:57 crc kubenswrapper[4704]: I1125 15:52:57.036188 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ace2c1c5-31ae-43db-891a-6a587176c215-etc-swift\") pod \"swift-storage-0\" (UID: \"ace2c1c5-31ae-43db-891a-6a587176c215\") " pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:57 crc kubenswrapper[4704]: I1125 15:52:57.047759 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Nov 25 15:52:57 crc kubenswrapper[4704]: I1125 15:52:57.310630 4704 generic.go:334] "Generic (PLEG): container finished" podID="f25398c6-78b6-4b82-b3bf-ac1037e47998" containerID="fc8856d6dd3401c5f95b4cfa50c296f7a17b70c1d89c2a27b72c2e3651d39677" exitCode=0 Nov 25 15:52:57 crc kubenswrapper[4704]: I1125 15:52:57.311405 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" event={"ID":"f25398c6-78b6-4b82-b3bf-ac1037e47998","Type":"ContainerDied","Data":"fc8856d6dd3401c5f95b4cfa50c296f7a17b70c1d89c2a27b72c2e3651d39677"} Nov 25 15:52:57 crc kubenswrapper[4704]: I1125 15:52:57.600909 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.330925 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ace2c1c5-31ae-43db-891a-6a587176c215","Type":"ContainerStarted","Data":"e40a742397577f41f597e2cdfd6e0461d242936041dd4c27233d8c31db0999a8"} Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.624039 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.654651 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f25398c6-78b6-4b82-b3bf-ac1037e47998-ring-data-devices\") pod \"f25398c6-78b6-4b82-b3bf-ac1037e47998\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.654725 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f25398c6-78b6-4b82-b3bf-ac1037e47998-swiftconf\") pod \"f25398c6-78b6-4b82-b3bf-ac1037e47998\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.655415 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f25398c6-78b6-4b82-b3bf-ac1037e47998-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f25398c6-78b6-4b82-b3bf-ac1037e47998" (UID: "f25398c6-78b6-4b82-b3bf-ac1037e47998"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.658846 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f25398c6-78b6-4b82-b3bf-ac1037e47998-scripts\") pod \"f25398c6-78b6-4b82-b3bf-ac1037e47998\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.658984 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f25398c6-78b6-4b82-b3bf-ac1037e47998-etc-swift\") pod \"f25398c6-78b6-4b82-b3bf-ac1037e47998\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.659027 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f25398c6-78b6-4b82-b3bf-ac1037e47998-dispersionconf\") pod \"f25398c6-78b6-4b82-b3bf-ac1037e47998\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.659064 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7csf\" (UniqueName: \"kubernetes.io/projected/f25398c6-78b6-4b82-b3bf-ac1037e47998-kube-api-access-w7csf\") pod \"f25398c6-78b6-4b82-b3bf-ac1037e47998\" (UID: \"f25398c6-78b6-4b82-b3bf-ac1037e47998\") " Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.659755 4704 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f25398c6-78b6-4b82-b3bf-ac1037e47998-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.662339 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f25398c6-78b6-4b82-b3bf-ac1037e47998-etc-swift" (OuterVolumeSpecName: "etc-swift") pod 
"f25398c6-78b6-4b82-b3bf-ac1037e47998" (UID: "f25398c6-78b6-4b82-b3bf-ac1037e47998"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.664231 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25398c6-78b6-4b82-b3bf-ac1037e47998-kube-api-access-w7csf" (OuterVolumeSpecName: "kube-api-access-w7csf") pod "f25398c6-78b6-4b82-b3bf-ac1037e47998" (UID: "f25398c6-78b6-4b82-b3bf-ac1037e47998"). InnerVolumeSpecName "kube-api-access-w7csf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.670746 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25398c6-78b6-4b82-b3bf-ac1037e47998-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f25398c6-78b6-4b82-b3bf-ac1037e47998" (UID: "f25398c6-78b6-4b82-b3bf-ac1037e47998"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.675208 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f25398c6-78b6-4b82-b3bf-ac1037e47998-scripts" (OuterVolumeSpecName: "scripts") pod "f25398c6-78b6-4b82-b3bf-ac1037e47998" (UID: "f25398c6-78b6-4b82-b3bf-ac1037e47998"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.677758 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25398c6-78b6-4b82-b3bf-ac1037e47998-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f25398c6-78b6-4b82-b3bf-ac1037e47998" (UID: "f25398c6-78b6-4b82-b3bf-ac1037e47998"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.761315 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7csf\" (UniqueName: \"kubernetes.io/projected/f25398c6-78b6-4b82-b3bf-ac1037e47998-kube-api-access-w7csf\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.761904 4704 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f25398c6-78b6-4b82-b3bf-ac1037e47998-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.761919 4704 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f25398c6-78b6-4b82-b3bf-ac1037e47998-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.761931 4704 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f25398c6-78b6-4b82-b3bf-ac1037e47998-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:58 crc kubenswrapper[4704]: I1125 15:52:58.761947 4704 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f25398c6-78b6-4b82-b3bf-ac1037e47998-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 25 15:52:59 crc kubenswrapper[4704]: I1125 15:52:59.340195 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ace2c1c5-31ae-43db-891a-6a587176c215","Type":"ContainerStarted","Data":"cfd7eabf488ceebcf1599b801aa4c01ed5916098e09ce2c6c5a55fb4e4ce5ccb"} Nov 25 15:52:59 crc kubenswrapper[4704]: I1125 15:52:59.340265 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ace2c1c5-31ae-43db-891a-6a587176c215","Type":"ContainerStarted","Data":"9bedcd2b14878cecce21599d073a7bc2ecb2bed511480fa7051aefe450c797ac"} Nov 25 
15:52:59 crc kubenswrapper[4704]: I1125 15:52:59.340277 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ace2c1c5-31ae-43db-891a-6a587176c215","Type":"ContainerStarted","Data":"676b19ac2e6f367d2ee4890739a42cdc31c2d3fc3ea46ba726a888d424e4f026"} Nov 25 15:52:59 crc kubenswrapper[4704]: I1125 15:52:59.340286 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ace2c1c5-31ae-43db-891a-6a587176c215","Type":"ContainerStarted","Data":"c676a0f9b6fa28bc08f0771a9fa8764e90565d0acadea7366088531de2c20b63"} Nov 25 15:52:59 crc kubenswrapper[4704]: I1125 15:52:59.342172 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" event={"ID":"f25398c6-78b6-4b82-b3bf-ac1037e47998","Type":"ContainerDied","Data":"a3235188b75ea97f4d88ae4a98d2d37cc4938ef6666f4722440d14110c8fd18d"} Nov 25 15:52:59 crc kubenswrapper[4704]: I1125 15:52:59.342202 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3235188b75ea97f4d88ae4a98d2d37cc4938ef6666f4722440d14110c8fd18d" Nov 25 15:52:59 crc kubenswrapper[4704]: I1125 15:52:59.342236 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-m7p8n" Nov 25 15:53:00 crc kubenswrapper[4704]: I1125 15:53:00.383085 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:53:00 crc kubenswrapper[4704]: I1125 15:53:00.391972 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/06d09490-29de-42d3-a33f-067f3c9ba573-etc-swift\") pod \"swift-proxy-6bd58cfcf7-ctjch\" (UID: \"06d09490-29de-42d3-a33f-067f3c9ba573\") " pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:53:00 crc kubenswrapper[4704]: I1125 15:53:00.681453 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:53:01 crc kubenswrapper[4704]: I1125 15:53:01.028691 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch"] Nov 25 15:53:01 crc kubenswrapper[4704]: I1125 15:53:01.361863 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ace2c1c5-31ae-43db-891a-6a587176c215","Type":"ContainerStarted","Data":"34d45afdd1afea240ac3dbe7e52978e07654ba159f3d4b23444a6ac95c14b095"} Nov 25 15:53:01 crc kubenswrapper[4704]: I1125 15:53:01.362393 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ace2c1c5-31ae-43db-891a-6a587176c215","Type":"ContainerStarted","Data":"1d647f9121584a3bf209b788994e403b21284f93f4184f5ee7a78833708f630e"} Nov 25 15:53:01 crc kubenswrapper[4704]: I1125 15:53:01.362408 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" 
event={"ID":"ace2c1c5-31ae-43db-891a-6a587176c215","Type":"ContainerStarted","Data":"79455d61b99bb5fc787e1c720e839db6a4a2377128abb83befbececc67e3f47c"} Nov 25 15:53:01 crc kubenswrapper[4704]: I1125 15:53:01.362417 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ace2c1c5-31ae-43db-891a-6a587176c215","Type":"ContainerStarted","Data":"1fb601c2d82a15aa2b3870e13730ff29ce2befd93e90ba57d12fc6d1f72b1480"} Nov 25 15:53:01 crc kubenswrapper[4704]: I1125 15:53:01.364517 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" event={"ID":"06d09490-29de-42d3-a33f-067f3c9ba573","Type":"ContainerStarted","Data":"68dcae13d7870685a29080dd61f90ef4a991369f6e80a4f759316e43882fab46"} Nov 25 15:53:01 crc kubenswrapper[4704]: I1125 15:53:01.364570 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" event={"ID":"06d09490-29de-42d3-a33f-067f3c9ba573","Type":"ContainerStarted","Data":"6ffa041afb4343f288a805afec1c96245744303e8ea9978cdc6e553b02b4aa8d"} Nov 25 15:53:01 crc kubenswrapper[4704]: I1125 15:53:01.364582 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" event={"ID":"06d09490-29de-42d3-a33f-067f3c9ba573","Type":"ContainerStarted","Data":"f0cafc0429db7549ce04b827025a2645387bf28bd4b68220a58e28036fbbc755"} Nov 25 15:53:01 crc kubenswrapper[4704]: I1125 15:53:01.364681 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:53:02 crc kubenswrapper[4704]: I1125 15:53:02.374909 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ace2c1c5-31ae-43db-891a-6a587176c215","Type":"ContainerStarted","Data":"87d73ae209434fe6dfb1ab53fbed00a7f1af259002ab4374a59e7419c0bdb600"} Nov 25 15:53:02 crc kubenswrapper[4704]: I1125 15:53:02.374967 4704 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ace2c1c5-31ae-43db-891a-6a587176c215","Type":"ContainerStarted","Data":"eee25a6e82547fded20de99a3e14c96c3b42ced33b651024f771cd109ab47cdd"} Nov 25 15:53:02 crc kubenswrapper[4704]: I1125 15:53:02.375310 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:53:03 crc kubenswrapper[4704]: I1125 15:53:03.121424 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-index-ph68v" Nov 25 15:53:03 crc kubenswrapper[4704]: I1125 15:53:03.122035 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/glance-operator-index-ph68v" Nov 25 15:53:03 crc kubenswrapper[4704]: I1125 15:53:03.223509 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/glance-operator-index-ph68v" Nov 25 15:53:03 crc kubenswrapper[4704]: I1125 15:53:03.247737 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" podStartSLOduration=19.247715434 podStartE2EDuration="19.247715434s" podCreationTimestamp="2025-11-25 15:52:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:53:01.387906815 +0000 UTC m=+1067.656180616" watchObservedRunningTime="2025-11-25 15:53:03.247715434 +0000 UTC m=+1069.515989215" Nov 25 15:53:03 crc kubenswrapper[4704]: I1125 15:53:03.388039 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ace2c1c5-31ae-43db-891a-6a587176c215","Type":"ContainerStarted","Data":"567239abe1fc8a32a56259cd5867e8a736a4fc24f74ff20d2f4b3da686f1786f"} Nov 25 15:53:03 crc kubenswrapper[4704]: I1125 15:53:03.388097 4704 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ace2c1c5-31ae-43db-891a-6a587176c215","Type":"ContainerStarted","Data":"dd1ef25c34e82b137b7cdc957d0f634156c512a2142d03f1053fcdd6776adaf5"} Nov 25 15:53:03 crc kubenswrapper[4704]: I1125 15:53:03.388107 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ace2c1c5-31ae-43db-891a-6a587176c215","Type":"ContainerStarted","Data":"75dd21828fe3bb21ed17cbe0e2f3f7c7383b1440aa5ef5733c9a27ba2bcb011e"} Nov 25 15:53:03 crc kubenswrapper[4704]: I1125 15:53:03.388116 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ace2c1c5-31ae-43db-891a-6a587176c215","Type":"ContainerStarted","Data":"cd44d484d9cd61e8b67792e3aadbd804d2449818dc0fcb6d173536f12e913f66"} Nov 25 15:53:03 crc kubenswrapper[4704]: I1125 15:53:03.445191 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-index-ph68v" Nov 25 15:53:04 crc kubenswrapper[4704]: I1125 15:53:04.402607 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"ace2c1c5-31ae-43db-891a-6a587176c215","Type":"ContainerStarted","Data":"8648fcfa464450453fe77a3915250c04a4507a515993a87446102830af419f34"} Nov 25 15:53:04 crc kubenswrapper[4704]: I1125 15:53:04.448123 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-storage-0" podStartSLOduration=20.092330736 podStartE2EDuration="24.448104807s" podCreationTimestamp="2025-11-25 15:52:40 +0000 UTC" firstStartedPulling="2025-11-25 15:52:57.606394175 +0000 UTC m=+1063.874667956" lastFinishedPulling="2025-11-25 15:53:01.962168246 +0000 UTC m=+1068.230442027" observedRunningTime="2025-11-25 15:53:04.439464387 +0000 UTC m=+1070.707738168" watchObservedRunningTime="2025-11-25 15:53:04.448104807 +0000 UTC m=+1070.716378588" Nov 25 15:53:07 crc kubenswrapper[4704]: I1125 
15:53:07.964538 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:53:07 crc kubenswrapper[4704]: I1125 15:53:07.966216 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:53:07 crc kubenswrapper[4704]: I1125 15:53:07.966348 4704 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:53:07 crc kubenswrapper[4704]: I1125 15:53:07.967146 4704 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0a8966b76dc1d40a4bda67fc26f25a19803f2f36d74b3a7ae6b45d74acb00ad9"} pod="openshift-machine-config-operator/machine-config-daemon-djz8x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 15:53:07 crc kubenswrapper[4704]: I1125 15:53:07.967367 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" containerID="cri-o://0a8966b76dc1d40a4bda67fc26f25a19803f2f36d74b3a7ae6b45d74acb00ad9" gracePeriod=600 Nov 25 15:53:08 crc kubenswrapper[4704]: I1125 15:53:08.430328 4704 generic.go:334] "Generic (PLEG): container finished" podID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerID="0a8966b76dc1d40a4bda67fc26f25a19803f2f36d74b3a7ae6b45d74acb00ad9" exitCode=0 Nov 25 
15:53:08 crc kubenswrapper[4704]: I1125 15:53:08.430407 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerDied","Data":"0a8966b76dc1d40a4bda67fc26f25a19803f2f36d74b3a7ae6b45d74acb00ad9"} Nov 25 15:53:08 crc kubenswrapper[4704]: I1125 15:53:08.430771 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerStarted","Data":"ed4f107353622069826562153315aa9eb23b779c9df0b35ea109bbd82177caad"} Nov 25 15:53:08 crc kubenswrapper[4704]: I1125 15:53:08.430846 4704 scope.go:117] "RemoveContainer" containerID="70c340a5598fd3ac0fcb6b9ef0ce0145e436d285d89d93a8b40ff742af895c50" Nov 25 15:53:10 crc kubenswrapper[4704]: I1125 15:53:10.684846 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:53:10 crc kubenswrapper[4704]: I1125 15:53:10.687310 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-6bd58cfcf7-ctjch" Nov 25 15:53:13 crc kubenswrapper[4704]: I1125 15:53:13.850437 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt"] Nov 25 15:53:13 crc kubenswrapper[4704]: E1125 15:53:13.852161 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25398c6-78b6-4b82-b3bf-ac1037e47998" containerName="swift-ring-rebalance" Nov 25 15:53:13 crc kubenswrapper[4704]: I1125 15:53:13.852191 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25398c6-78b6-4b82-b3bf-ac1037e47998" containerName="swift-ring-rebalance" Nov 25 15:53:13 crc kubenswrapper[4704]: I1125 15:53:13.852318 4704 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f25398c6-78b6-4b82-b3bf-ac1037e47998" containerName="swift-ring-rebalance" Nov 25 15:53:13 crc kubenswrapper[4704]: I1125 15:53:13.853386 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" Nov 25 15:53:13 crc kubenswrapper[4704]: I1125 15:53:13.856281 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8zdtm" Nov 25 15:53:13 crc kubenswrapper[4704]: I1125 15:53:13.859890 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt"] Nov 25 15:53:13 crc kubenswrapper[4704]: I1125 15:53:13.933049 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6fwp\" (UniqueName: \"kubernetes.io/projected/7418c389-acf6-4fe8-b7be-b149451e186a-kube-api-access-d6fwp\") pod \"2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt\" (UID: \"7418c389-acf6-4fe8-b7be-b149451e186a\") " pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" Nov 25 15:53:13 crc kubenswrapper[4704]: I1125 15:53:13.933111 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7418c389-acf6-4fe8-b7be-b149451e186a-util\") pod \"2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt\" (UID: \"7418c389-acf6-4fe8-b7be-b149451e186a\") " pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" Nov 25 15:53:13 crc kubenswrapper[4704]: I1125 15:53:13.933333 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7418c389-acf6-4fe8-b7be-b149451e186a-bundle\") pod 
\"2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt\" (UID: \"7418c389-acf6-4fe8-b7be-b149451e186a\") " pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" Nov 25 15:53:14 crc kubenswrapper[4704]: I1125 15:53:14.034485 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6fwp\" (UniqueName: \"kubernetes.io/projected/7418c389-acf6-4fe8-b7be-b149451e186a-kube-api-access-d6fwp\") pod \"2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt\" (UID: \"7418c389-acf6-4fe8-b7be-b149451e186a\") " pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" Nov 25 15:53:14 crc kubenswrapper[4704]: I1125 15:53:14.034558 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7418c389-acf6-4fe8-b7be-b149451e186a-util\") pod \"2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt\" (UID: \"7418c389-acf6-4fe8-b7be-b149451e186a\") " pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" Nov 25 15:53:14 crc kubenswrapper[4704]: I1125 15:53:14.034610 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7418c389-acf6-4fe8-b7be-b149451e186a-bundle\") pod \"2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt\" (UID: \"7418c389-acf6-4fe8-b7be-b149451e186a\") " pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" Nov 25 15:53:14 crc kubenswrapper[4704]: I1125 15:53:14.035275 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7418c389-acf6-4fe8-b7be-b149451e186a-bundle\") pod \"2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt\" (UID: \"7418c389-acf6-4fe8-b7be-b149451e186a\") " 
pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" Nov 25 15:53:14 crc kubenswrapper[4704]: I1125 15:53:14.035404 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7418c389-acf6-4fe8-b7be-b149451e186a-util\") pod \"2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt\" (UID: \"7418c389-acf6-4fe8-b7be-b149451e186a\") " pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" Nov 25 15:53:14 crc kubenswrapper[4704]: I1125 15:53:14.055312 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6fwp\" (UniqueName: \"kubernetes.io/projected/7418c389-acf6-4fe8-b7be-b149451e186a-kube-api-access-d6fwp\") pod \"2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt\" (UID: \"7418c389-acf6-4fe8-b7be-b149451e186a\") " pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" Nov 25 15:53:14 crc kubenswrapper[4704]: I1125 15:53:14.173034 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" Nov 25 15:53:14 crc kubenswrapper[4704]: I1125 15:53:14.642101 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt"] Nov 25 15:53:14 crc kubenswrapper[4704]: W1125 15:53:14.649934 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7418c389_acf6_4fe8_b7be_b149451e186a.slice/crio-0bf7d2914260de7101c0d83b7eff614b955ed4a9b94ef0b2e314b34b770981e2 WatchSource:0}: Error finding container 0bf7d2914260de7101c0d83b7eff614b955ed4a9b94ef0b2e314b34b770981e2: Status 404 returned error can't find the container with id 0bf7d2914260de7101c0d83b7eff614b955ed4a9b94ef0b2e314b34b770981e2 Nov 25 15:53:15 crc kubenswrapper[4704]: I1125 15:53:15.482871 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" event={"ID":"7418c389-acf6-4fe8-b7be-b149451e186a","Type":"ContainerStarted","Data":"0bf7d2914260de7101c0d83b7eff614b955ed4a9b94ef0b2e314b34b770981e2"} Nov 25 15:53:16 crc kubenswrapper[4704]: I1125 15:53:16.492976 4704 generic.go:334] "Generic (PLEG): container finished" podID="7418c389-acf6-4fe8-b7be-b149451e186a" containerID="a0712654acfd7b3993279d27016c533b8033550e665426d6e074f3908964873a" exitCode=0 Nov 25 15:53:16 crc kubenswrapper[4704]: I1125 15:53:16.493057 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" event={"ID":"7418c389-acf6-4fe8-b7be-b149451e186a","Type":"ContainerDied","Data":"a0712654acfd7b3993279d27016c533b8033550e665426d6e074f3908964873a"} Nov 25 15:53:18 crc kubenswrapper[4704]: I1125 15:53:18.532009 4704 generic.go:334] "Generic (PLEG): container finished" 
podID="7418c389-acf6-4fe8-b7be-b149451e186a" containerID="0e4299fca9252e601975277281a1a0f7393044e94713cce7909aeb7b89c162d9" exitCode=0 Nov 25 15:53:18 crc kubenswrapper[4704]: I1125 15:53:18.532062 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" event={"ID":"7418c389-acf6-4fe8-b7be-b149451e186a","Type":"ContainerDied","Data":"0e4299fca9252e601975277281a1a0f7393044e94713cce7909aeb7b89c162d9"} Nov 25 15:53:19 crc kubenswrapper[4704]: I1125 15:53:19.543744 4704 generic.go:334] "Generic (PLEG): container finished" podID="7418c389-acf6-4fe8-b7be-b149451e186a" containerID="0feaee02ff90846a04b7db63c6185c73e76487be552ab0ab4a703598697693c7" exitCode=0 Nov 25 15:53:19 crc kubenswrapper[4704]: I1125 15:53:19.543842 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" event={"ID":"7418c389-acf6-4fe8-b7be-b149451e186a","Type":"ContainerDied","Data":"0feaee02ff90846a04b7db63c6185c73e76487be552ab0ab4a703598697693c7"} Nov 25 15:53:20 crc kubenswrapper[4704]: I1125 15:53:20.812898 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" Nov 25 15:53:20 crc kubenswrapper[4704]: I1125 15:53:20.943386 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7418c389-acf6-4fe8-b7be-b149451e186a-bundle\") pod \"7418c389-acf6-4fe8-b7be-b149451e186a\" (UID: \"7418c389-acf6-4fe8-b7be-b149451e186a\") " Nov 25 15:53:20 crc kubenswrapper[4704]: I1125 15:53:20.943459 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6fwp\" (UniqueName: \"kubernetes.io/projected/7418c389-acf6-4fe8-b7be-b149451e186a-kube-api-access-d6fwp\") pod \"7418c389-acf6-4fe8-b7be-b149451e186a\" (UID: \"7418c389-acf6-4fe8-b7be-b149451e186a\") " Nov 25 15:53:20 crc kubenswrapper[4704]: I1125 15:53:20.943497 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7418c389-acf6-4fe8-b7be-b149451e186a-util\") pod \"7418c389-acf6-4fe8-b7be-b149451e186a\" (UID: \"7418c389-acf6-4fe8-b7be-b149451e186a\") " Nov 25 15:53:20 crc kubenswrapper[4704]: I1125 15:53:20.944590 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7418c389-acf6-4fe8-b7be-b149451e186a-bundle" (OuterVolumeSpecName: "bundle") pod "7418c389-acf6-4fe8-b7be-b149451e186a" (UID: "7418c389-acf6-4fe8-b7be-b149451e186a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:53:20 crc kubenswrapper[4704]: I1125 15:53:20.949856 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7418c389-acf6-4fe8-b7be-b149451e186a-kube-api-access-d6fwp" (OuterVolumeSpecName: "kube-api-access-d6fwp") pod "7418c389-acf6-4fe8-b7be-b149451e186a" (UID: "7418c389-acf6-4fe8-b7be-b149451e186a"). InnerVolumeSpecName "kube-api-access-d6fwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:53:20 crc kubenswrapper[4704]: I1125 15:53:20.958387 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7418c389-acf6-4fe8-b7be-b149451e186a-util" (OuterVolumeSpecName: "util") pod "7418c389-acf6-4fe8-b7be-b149451e186a" (UID: "7418c389-acf6-4fe8-b7be-b149451e186a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:53:21 crc kubenswrapper[4704]: I1125 15:53:21.045155 4704 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7418c389-acf6-4fe8-b7be-b149451e186a-util\") on node \"crc\" DevicePath \"\"" Nov 25 15:53:21 crc kubenswrapper[4704]: I1125 15:53:21.045201 4704 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7418c389-acf6-4fe8-b7be-b149451e186a-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:53:21 crc kubenswrapper[4704]: I1125 15:53:21.045216 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6fwp\" (UniqueName: \"kubernetes.io/projected/7418c389-acf6-4fe8-b7be-b149451e186a-kube-api-access-d6fwp\") on node \"crc\" DevicePath \"\"" Nov 25 15:53:21 crc kubenswrapper[4704]: I1125 15:53:21.559823 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" event={"ID":"7418c389-acf6-4fe8-b7be-b149451e186a","Type":"ContainerDied","Data":"0bf7d2914260de7101c0d83b7eff614b955ed4a9b94ef0b2e314b34b770981e2"} Nov 25 15:53:21 crc kubenswrapper[4704]: I1125 15:53:21.559861 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bf7d2914260de7101c0d83b7eff614b955ed4a9b94ef0b2e314b34b770981e2" Nov 25 15:53:21 crc kubenswrapper[4704]: I1125 15:53:21.559906 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.410309 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7"] Nov 25 15:53:39 crc kubenswrapper[4704]: E1125 15:53:39.412253 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7418c389-acf6-4fe8-b7be-b149451e186a" containerName="extract" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.412288 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="7418c389-acf6-4fe8-b7be-b149451e186a" containerName="extract" Nov 25 15:53:39 crc kubenswrapper[4704]: E1125 15:53:39.412306 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7418c389-acf6-4fe8-b7be-b149451e186a" containerName="pull" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.412313 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="7418c389-acf6-4fe8-b7be-b149451e186a" containerName="pull" Nov 25 15:53:39 crc kubenswrapper[4704]: E1125 15:53:39.412327 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7418c389-acf6-4fe8-b7be-b149451e186a" containerName="util" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.412334 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="7418c389-acf6-4fe8-b7be-b149451e186a" containerName="util" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.412529 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="7418c389-acf6-4fe8-b7be-b149451e186a" containerName="extract" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.413145 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.415285 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-service-cert" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.415632 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-skhpv" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.434344 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7"] Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.542363 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3649b0e8-675b-4b8f-9d4a-ff24b9edf553-webhook-cert\") pod \"glance-operator-controller-manager-9fd6d6f67-vlkl7\" (UID: \"3649b0e8-675b-4b8f-9d4a-ff24b9edf553\") " pod="openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.542479 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmrg5\" (UniqueName: \"kubernetes.io/projected/3649b0e8-675b-4b8f-9d4a-ff24b9edf553-kube-api-access-fmrg5\") pod \"glance-operator-controller-manager-9fd6d6f67-vlkl7\" (UID: \"3649b0e8-675b-4b8f-9d4a-ff24b9edf553\") " pod="openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.542566 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3649b0e8-675b-4b8f-9d4a-ff24b9edf553-apiservice-cert\") pod \"glance-operator-controller-manager-9fd6d6f67-vlkl7\" (UID: 
\"3649b0e8-675b-4b8f-9d4a-ff24b9edf553\") " pod="openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.643675 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3649b0e8-675b-4b8f-9d4a-ff24b9edf553-webhook-cert\") pod \"glance-operator-controller-manager-9fd6d6f67-vlkl7\" (UID: \"3649b0e8-675b-4b8f-9d4a-ff24b9edf553\") " pod="openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.644357 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmrg5\" (UniqueName: \"kubernetes.io/projected/3649b0e8-675b-4b8f-9d4a-ff24b9edf553-kube-api-access-fmrg5\") pod \"glance-operator-controller-manager-9fd6d6f67-vlkl7\" (UID: \"3649b0e8-675b-4b8f-9d4a-ff24b9edf553\") " pod="openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.644457 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3649b0e8-675b-4b8f-9d4a-ff24b9edf553-apiservice-cert\") pod \"glance-operator-controller-manager-9fd6d6f67-vlkl7\" (UID: \"3649b0e8-675b-4b8f-9d4a-ff24b9edf553\") " pod="openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.653057 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3649b0e8-675b-4b8f-9d4a-ff24b9edf553-apiservice-cert\") pod \"glance-operator-controller-manager-9fd6d6f67-vlkl7\" (UID: \"3649b0e8-675b-4b8f-9d4a-ff24b9edf553\") " pod="openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.653438 4704 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3649b0e8-675b-4b8f-9d4a-ff24b9edf553-webhook-cert\") pod \"glance-operator-controller-manager-9fd6d6f67-vlkl7\" (UID: \"3649b0e8-675b-4b8f-9d4a-ff24b9edf553\") " pod="openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.665884 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmrg5\" (UniqueName: \"kubernetes.io/projected/3649b0e8-675b-4b8f-9d4a-ff24b9edf553-kube-api-access-fmrg5\") pod \"glance-operator-controller-manager-9fd6d6f67-vlkl7\" (UID: \"3649b0e8-675b-4b8f-9d4a-ff24b9edf553\") " pod="openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7" Nov 25 15:53:39 crc kubenswrapper[4704]: I1125 15:53:39.738865 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7" Nov 25 15:53:40 crc kubenswrapper[4704]: I1125 15:53:40.236320 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7"] Nov 25 15:53:40 crc kubenswrapper[4704]: I1125 15:53:40.716075 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7" event={"ID":"3649b0e8-675b-4b8f-9d4a-ff24b9edf553","Type":"ContainerStarted","Data":"0957334c3a253982f81273258a72f5b9b44e82b654bf7c9415dc5f9db0e1a05a"} Nov 25 15:53:42 crc kubenswrapper[4704]: I1125 15:53:42.740403 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7" event={"ID":"3649b0e8-675b-4b8f-9d4a-ff24b9edf553","Type":"ContainerStarted","Data":"be45a67e896a306957a35244a4a0fa786326970933f29c53e1bb0f5d36e76d1e"} Nov 25 15:53:42 crc kubenswrapper[4704]: I1125 15:53:42.741249 4704 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7" Nov 25 15:53:42 crc kubenswrapper[4704]: I1125 15:53:42.757659 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7" podStartSLOduration=2.049494383 podStartE2EDuration="3.757606204s" podCreationTimestamp="2025-11-25 15:53:39 +0000 UTC" firstStartedPulling="2025-11-25 15:53:40.260011298 +0000 UTC m=+1106.528285079" lastFinishedPulling="2025-11-25 15:53:41.968123129 +0000 UTC m=+1108.236396900" observedRunningTime="2025-11-25 15:53:42.757053878 +0000 UTC m=+1109.025327689" watchObservedRunningTime="2025-11-25 15:53:42.757606204 +0000 UTC m=+1109.025879985" Nov 25 15:53:49 crc kubenswrapper[4704]: I1125 15:53:49.743969 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-9fd6d6f67-vlkl7" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.397245 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-6nxcq"] Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.398806 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-6nxcq" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.414599 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-6nxcq"] Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.490763 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.491822 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.493935 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-sp8pv" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.494355 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.495216 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.495945 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.500525 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.505473 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/daf1e165-d977-4aa8-a214-d9fa6552ec18-operator-scripts\") pod \"glance-db-create-6nxcq\" (UID: \"daf1e165-d977-4aa8-a214-d9fa6552ec18\") " pod="glance-kuttl-tests/glance-db-create-6nxcq" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.505593 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mz9z\" (UniqueName: \"kubernetes.io/projected/daf1e165-d977-4aa8-a214-d9fa6552ec18-kube-api-access-8mz9z\") pod \"glance-db-create-6nxcq\" (UID: \"daf1e165-d977-4aa8-a214-d9fa6552ec18\") " pod="glance-kuttl-tests/glance-db-create-6nxcq" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.601280 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-50cb-account-create-update-t2pzh"] Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 
15:53:51.602315 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-50cb-account-create-update-t2pzh" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.604348 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.606420 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-config\") pod \"openstackclient\" (UID: \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\") " pod="glance-kuttl-tests/openstackclient" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.606501 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-scripts\") pod \"openstackclient\" (UID: \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\") " pod="glance-kuttl-tests/openstackclient" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.606539 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/daf1e165-d977-4aa8-a214-d9fa6552ec18-operator-scripts\") pod \"glance-db-create-6nxcq\" (UID: \"daf1e165-d977-4aa8-a214-d9fa6552ec18\") " pod="glance-kuttl-tests/glance-db-create-6nxcq" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.606570 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdjlr\" (UniqueName: \"kubernetes.io/projected/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-kube-api-access-tdjlr\") pod \"openstackclient\" (UID: \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\") " pod="glance-kuttl-tests/openstackclient" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.606599 4704 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-config-secret\") pod \"openstackclient\" (UID: \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\") " pod="glance-kuttl-tests/openstackclient" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.606642 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mz9z\" (UniqueName: \"kubernetes.io/projected/daf1e165-d977-4aa8-a214-d9fa6552ec18-kube-api-access-8mz9z\") pod \"glance-db-create-6nxcq\" (UID: \"daf1e165-d977-4aa8-a214-d9fa6552ec18\") " pod="glance-kuttl-tests/glance-db-create-6nxcq" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.607288 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/daf1e165-d977-4aa8-a214-d9fa6552ec18-operator-scripts\") pod \"glance-db-create-6nxcq\" (UID: \"daf1e165-d977-4aa8-a214-d9fa6552ec18\") " pod="glance-kuttl-tests/glance-db-create-6nxcq" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.614290 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-50cb-account-create-update-t2pzh"] Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.633370 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mz9z\" (UniqueName: \"kubernetes.io/projected/daf1e165-d977-4aa8-a214-d9fa6552ec18-kube-api-access-8mz9z\") pod \"glance-db-create-6nxcq\" (UID: \"daf1e165-d977-4aa8-a214-d9fa6552ec18\") " pod="glance-kuttl-tests/glance-db-create-6nxcq" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.707966 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7391f8a5-e178-4c42-9845-015504a91779-operator-scripts\") pod 
\"glance-50cb-account-create-update-t2pzh\" (UID: \"7391f8a5-e178-4c42-9845-015504a91779\") " pod="glance-kuttl-tests/glance-50cb-account-create-update-t2pzh" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.708034 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtdq2\" (UniqueName: \"kubernetes.io/projected/7391f8a5-e178-4c42-9845-015504a91779-kube-api-access-dtdq2\") pod \"glance-50cb-account-create-update-t2pzh\" (UID: \"7391f8a5-e178-4c42-9845-015504a91779\") " pod="glance-kuttl-tests/glance-50cb-account-create-update-t2pzh" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.708088 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-config\") pod \"openstackclient\" (UID: \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\") " pod="glance-kuttl-tests/openstackclient" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.708154 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-scripts\") pod \"openstackclient\" (UID: \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\") " pod="glance-kuttl-tests/openstackclient" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.708186 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdjlr\" (UniqueName: \"kubernetes.io/projected/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-kube-api-access-tdjlr\") pod \"openstackclient\" (UID: \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\") " pod="glance-kuttl-tests/openstackclient" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.708208 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-config-secret\") pod \"openstackclient\" (UID: \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\") " pod="glance-kuttl-tests/openstackclient" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.709193 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-config\") pod \"openstackclient\" (UID: \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\") " pod="glance-kuttl-tests/openstackclient" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.709736 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-scripts\") pod \"openstackclient\" (UID: \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\") " pod="glance-kuttl-tests/openstackclient" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.715396 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-config-secret\") pod \"openstackclient\" (UID: \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\") " pod="glance-kuttl-tests/openstackclient" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.719124 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-6nxcq" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.732891 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdjlr\" (UniqueName: \"kubernetes.io/projected/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-kube-api-access-tdjlr\") pod \"openstackclient\" (UID: \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\") " pod="glance-kuttl-tests/openstackclient" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.810073 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7391f8a5-e178-4c42-9845-015504a91779-operator-scripts\") pod \"glance-50cb-account-create-update-t2pzh\" (UID: \"7391f8a5-e178-4c42-9845-015504a91779\") " pod="glance-kuttl-tests/glance-50cb-account-create-update-t2pzh" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.810510 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtdq2\" (UniqueName: \"kubernetes.io/projected/7391f8a5-e178-4c42-9845-015504a91779-kube-api-access-dtdq2\") pod \"glance-50cb-account-create-update-t2pzh\" (UID: \"7391f8a5-e178-4c42-9845-015504a91779\") " pod="glance-kuttl-tests/glance-50cb-account-create-update-t2pzh" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.810978 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7391f8a5-e178-4c42-9845-015504a91779-operator-scripts\") pod \"glance-50cb-account-create-update-t2pzh\" (UID: \"7391f8a5-e178-4c42-9845-015504a91779\") " pod="glance-kuttl-tests/glance-50cb-account-create-update-t2pzh" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.812253 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.827724 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtdq2\" (UniqueName: \"kubernetes.io/projected/7391f8a5-e178-4c42-9845-015504a91779-kube-api-access-dtdq2\") pod \"glance-50cb-account-create-update-t2pzh\" (UID: \"7391f8a5-e178-4c42-9845-015504a91779\") " pod="glance-kuttl-tests/glance-50cb-account-create-update-t2pzh" Nov 25 15:53:51 crc kubenswrapper[4704]: I1125 15:53:51.918107 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-50cb-account-create-update-t2pzh" Nov 25 15:53:52 crc kubenswrapper[4704]: I1125 15:53:52.208201 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-6nxcq"] Nov 25 15:53:52 crc kubenswrapper[4704]: I1125 15:53:52.322205 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 25 15:53:52 crc kubenswrapper[4704]: I1125 15:53:52.456364 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-50cb-account-create-update-t2pzh"] Nov 25 15:53:52 crc kubenswrapper[4704]: W1125 15:53:52.462081 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7391f8a5_e178_4c42_9845_015504a91779.slice/crio-d676d22d31c19944a1b17a610760d3d77c5efdeb8760617d995cdb94c78f4ad5 WatchSource:0}: Error finding container d676d22d31c19944a1b17a610760d3d77c5efdeb8760617d995cdb94c78f4ad5: Status 404 returned error can't find the container with id d676d22d31c19944a1b17a610760d3d77c5efdeb8760617d995cdb94c78f4ad5 Nov 25 15:53:52 crc kubenswrapper[4704]: I1125 15:53:52.814101 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-50cb-account-create-update-t2pzh" 
event={"ID":"7391f8a5-e178-4c42-9845-015504a91779","Type":"ContainerStarted","Data":"d9fa378769c2cf156dab36ce579906fa433f83b060ba3c1ebc27a5daba6cfdf8"} Nov 25 15:53:52 crc kubenswrapper[4704]: I1125 15:53:52.814585 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-50cb-account-create-update-t2pzh" event={"ID":"7391f8a5-e178-4c42-9845-015504a91779","Type":"ContainerStarted","Data":"d676d22d31c19944a1b17a610760d3d77c5efdeb8760617d995cdb94c78f4ad5"} Nov 25 15:53:52 crc kubenswrapper[4704]: I1125 15:53:52.816453 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f","Type":"ContainerStarted","Data":"b9961e751f697547cac820249b7bc822fc8f48f963710d94e1f0cf9f7b824f37"} Nov 25 15:53:52 crc kubenswrapper[4704]: I1125 15:53:52.818467 4704 generic.go:334] "Generic (PLEG): container finished" podID="daf1e165-d977-4aa8-a214-d9fa6552ec18" containerID="edde44852ee9a22b0b119321083e622f279f11ec499478e232c9c78c0c5b4909" exitCode=0 Nov 25 15:53:52 crc kubenswrapper[4704]: I1125 15:53:52.818502 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-6nxcq" event={"ID":"daf1e165-d977-4aa8-a214-d9fa6552ec18","Type":"ContainerDied","Data":"edde44852ee9a22b0b119321083e622f279f11ec499478e232c9c78c0c5b4909"} Nov 25 15:53:52 crc kubenswrapper[4704]: I1125 15:53:52.818519 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-6nxcq" event={"ID":"daf1e165-d977-4aa8-a214-d9fa6552ec18","Type":"ContainerStarted","Data":"87fed5a3d450f8142180c5b980270a5248df3e717e97b098c16288a246493f93"} Nov 25 15:53:52 crc kubenswrapper[4704]: I1125 15:53:52.832270 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-50cb-account-create-update-t2pzh" podStartSLOduration=1.832252128 podStartE2EDuration="1.832252128s" podCreationTimestamp="2025-11-25 15:53:51 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:53:52.832235697 +0000 UTC m=+1119.100509478" watchObservedRunningTime="2025-11-25 15:53:52.832252128 +0000 UTC m=+1119.100525909" Nov 25 15:53:53 crc kubenswrapper[4704]: I1125 15:53:53.841436 4704 generic.go:334] "Generic (PLEG): container finished" podID="7391f8a5-e178-4c42-9845-015504a91779" containerID="d9fa378769c2cf156dab36ce579906fa433f83b060ba3c1ebc27a5daba6cfdf8" exitCode=0 Nov 25 15:53:53 crc kubenswrapper[4704]: I1125 15:53:53.842557 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-50cb-account-create-update-t2pzh" event={"ID":"7391f8a5-e178-4c42-9845-015504a91779","Type":"ContainerDied","Data":"d9fa378769c2cf156dab36ce579906fa433f83b060ba3c1ebc27a5daba6cfdf8"} Nov 25 15:53:54 crc kubenswrapper[4704]: I1125 15:53:54.219006 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-6nxcq" Nov 25 15:53:54 crc kubenswrapper[4704]: I1125 15:53:54.353830 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mz9z\" (UniqueName: \"kubernetes.io/projected/daf1e165-d977-4aa8-a214-d9fa6552ec18-kube-api-access-8mz9z\") pod \"daf1e165-d977-4aa8-a214-d9fa6552ec18\" (UID: \"daf1e165-d977-4aa8-a214-d9fa6552ec18\") " Nov 25 15:53:54 crc kubenswrapper[4704]: I1125 15:53:54.354398 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/daf1e165-d977-4aa8-a214-d9fa6552ec18-operator-scripts\") pod \"daf1e165-d977-4aa8-a214-d9fa6552ec18\" (UID: \"daf1e165-d977-4aa8-a214-d9fa6552ec18\") " Nov 25 15:53:54 crc kubenswrapper[4704]: I1125 15:53:54.355175 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/daf1e165-d977-4aa8-a214-d9fa6552ec18-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "daf1e165-d977-4aa8-a214-d9fa6552ec18" (UID: "daf1e165-d977-4aa8-a214-d9fa6552ec18"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:53:54 crc kubenswrapper[4704]: I1125 15:53:54.361903 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daf1e165-d977-4aa8-a214-d9fa6552ec18-kube-api-access-8mz9z" (OuterVolumeSpecName: "kube-api-access-8mz9z") pod "daf1e165-d977-4aa8-a214-d9fa6552ec18" (UID: "daf1e165-d977-4aa8-a214-d9fa6552ec18"). InnerVolumeSpecName "kube-api-access-8mz9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:53:54 crc kubenswrapper[4704]: I1125 15:53:54.456190 4704 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/daf1e165-d977-4aa8-a214-d9fa6552ec18-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:53:54 crc kubenswrapper[4704]: I1125 15:53:54.456234 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mz9z\" (UniqueName: \"kubernetes.io/projected/daf1e165-d977-4aa8-a214-d9fa6552ec18-kube-api-access-8mz9z\") on node \"crc\" DevicePath \"\"" Nov 25 15:53:54 crc kubenswrapper[4704]: I1125 15:53:54.854547 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-6nxcq" event={"ID":"daf1e165-d977-4aa8-a214-d9fa6552ec18","Type":"ContainerDied","Data":"87fed5a3d450f8142180c5b980270a5248df3e717e97b098c16288a246493f93"} Nov 25 15:53:54 crc kubenswrapper[4704]: I1125 15:53:54.854605 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87fed5a3d450f8142180c5b980270a5248df3e717e97b098c16288a246493f93" Nov 25 15:53:54 crc kubenswrapper[4704]: I1125 15:53:54.854619 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-6nxcq" Nov 25 15:53:55 crc kubenswrapper[4704]: I1125 15:53:55.163582 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-50cb-account-create-update-t2pzh" Nov 25 15:53:55 crc kubenswrapper[4704]: I1125 15:53:55.268037 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7391f8a5-e178-4c42-9845-015504a91779-operator-scripts\") pod \"7391f8a5-e178-4c42-9845-015504a91779\" (UID: \"7391f8a5-e178-4c42-9845-015504a91779\") " Nov 25 15:53:55 crc kubenswrapper[4704]: I1125 15:53:55.268156 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtdq2\" (UniqueName: \"kubernetes.io/projected/7391f8a5-e178-4c42-9845-015504a91779-kube-api-access-dtdq2\") pod \"7391f8a5-e178-4c42-9845-015504a91779\" (UID: \"7391f8a5-e178-4c42-9845-015504a91779\") " Nov 25 15:53:55 crc kubenswrapper[4704]: I1125 15:53:55.270200 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7391f8a5-e178-4c42-9845-015504a91779-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7391f8a5-e178-4c42-9845-015504a91779" (UID: "7391f8a5-e178-4c42-9845-015504a91779"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:53:55 crc kubenswrapper[4704]: I1125 15:53:55.283534 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7391f8a5-e178-4c42-9845-015504a91779-kube-api-access-dtdq2" (OuterVolumeSpecName: "kube-api-access-dtdq2") pod "7391f8a5-e178-4c42-9845-015504a91779" (UID: "7391f8a5-e178-4c42-9845-015504a91779"). InnerVolumeSpecName "kube-api-access-dtdq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:53:55 crc kubenswrapper[4704]: I1125 15:53:55.370143 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtdq2\" (UniqueName: \"kubernetes.io/projected/7391f8a5-e178-4c42-9845-015504a91779-kube-api-access-dtdq2\") on node \"crc\" DevicePath \"\"" Nov 25 15:53:55 crc kubenswrapper[4704]: I1125 15:53:55.370180 4704 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7391f8a5-e178-4c42-9845-015504a91779-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:53:55 crc kubenswrapper[4704]: I1125 15:53:55.867605 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-50cb-account-create-update-t2pzh" event={"ID":"7391f8a5-e178-4c42-9845-015504a91779","Type":"ContainerDied","Data":"d676d22d31c19944a1b17a610760d3d77c5efdeb8760617d995cdb94c78f4ad5"} Nov 25 15:53:55 crc kubenswrapper[4704]: I1125 15:53:55.867657 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d676d22d31c19944a1b17a610760d3d77c5efdeb8760617d995cdb94c78f4ad5" Nov 25 15:53:55 crc kubenswrapper[4704]: I1125 15:53:55.867723 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-50cb-account-create-update-t2pzh" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.718439 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-m7kk7"] Nov 25 15:53:56 crc kubenswrapper[4704]: E1125 15:53:56.718907 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daf1e165-d977-4aa8-a214-d9fa6552ec18" containerName="mariadb-database-create" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.718924 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf1e165-d977-4aa8-a214-d9fa6552ec18" containerName="mariadb-database-create" Nov 25 15:53:56 crc kubenswrapper[4704]: E1125 15:53:56.718933 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7391f8a5-e178-4c42-9845-015504a91779" containerName="mariadb-account-create-update" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.718939 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="7391f8a5-e178-4c42-9845-015504a91779" containerName="mariadb-account-create-update" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.719090 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="daf1e165-d977-4aa8-a214-d9fa6552ec18" containerName="mariadb-database-create" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.719104 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="7391f8a5-e178-4c42-9845-015504a91779" containerName="mariadb-account-create-update" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.719577 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-m7kk7" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.723702 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.723888 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-6sqgt" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.746504 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-m7kk7"] Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.794190 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-config-data\") pod \"glance-db-sync-m7kk7\" (UID: \"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44\") " pod="glance-kuttl-tests/glance-db-sync-m7kk7" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.794291 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbkch\" (UniqueName: \"kubernetes.io/projected/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-kube-api-access-zbkch\") pod \"glance-db-sync-m7kk7\" (UID: \"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44\") " pod="glance-kuttl-tests/glance-db-sync-m7kk7" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.794313 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-db-sync-config-data\") pod \"glance-db-sync-m7kk7\" (UID: \"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44\") " pod="glance-kuttl-tests/glance-db-sync-m7kk7" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.896089 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-config-data\") pod \"glance-db-sync-m7kk7\" (UID: \"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44\") " pod="glance-kuttl-tests/glance-db-sync-m7kk7" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.896168 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbkch\" (UniqueName: \"kubernetes.io/projected/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-kube-api-access-zbkch\") pod \"glance-db-sync-m7kk7\" (UID: \"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44\") " pod="glance-kuttl-tests/glance-db-sync-m7kk7" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.896189 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-db-sync-config-data\") pod \"glance-db-sync-m7kk7\" (UID: \"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44\") " pod="glance-kuttl-tests/glance-db-sync-m7kk7" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.901579 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-db-sync-config-data\") pod \"glance-db-sync-m7kk7\" (UID: \"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44\") " pod="glance-kuttl-tests/glance-db-sync-m7kk7" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.908243 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-config-data\") pod \"glance-db-sync-m7kk7\" (UID: \"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44\") " pod="glance-kuttl-tests/glance-db-sync-m7kk7" Nov 25 15:53:56 crc kubenswrapper[4704]: I1125 15:53:56.921038 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbkch\" (UniqueName: \"kubernetes.io/projected/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-kube-api-access-zbkch\") pod 
\"glance-db-sync-m7kk7\" (UID: \"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44\") " pod="glance-kuttl-tests/glance-db-sync-m7kk7" Nov 25 15:53:57 crc kubenswrapper[4704]: I1125 15:53:57.037987 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-m7kk7" Nov 25 15:54:01 crc kubenswrapper[4704]: I1125 15:54:01.925931 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f","Type":"ContainerStarted","Data":"d19757ffcc5355812e84aeccf0d3a2bf4b6b587c36b7dd634eb55bdbf36104a3"} Nov 25 15:54:01 crc kubenswrapper[4704]: I1125 15:54:01.945209 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=1.584225137 podStartE2EDuration="10.94518961s" podCreationTimestamp="2025-11-25 15:53:51 +0000 UTC" firstStartedPulling="2025-11-25 15:53:52.325093315 +0000 UTC m=+1118.593367096" lastFinishedPulling="2025-11-25 15:54:01.686057788 +0000 UTC m=+1127.954331569" observedRunningTime="2025-11-25 15:54:01.943931804 +0000 UTC m=+1128.212205605" watchObservedRunningTime="2025-11-25 15:54:01.94518961 +0000 UTC m=+1128.213463401" Nov 25 15:54:02 crc kubenswrapper[4704]: I1125 15:54:02.033197 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-m7kk7"] Nov 25 15:54:02 crc kubenswrapper[4704]: I1125 15:54:02.936709 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-m7kk7" event={"ID":"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44","Type":"ContainerStarted","Data":"4269586c15b0c33688a75b2c44b76fb90281d62034fa9f84ec446c4b5fe8bf0d"} Nov 25 15:54:15 crc kubenswrapper[4704]: I1125 15:54:15.035503 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-m7kk7" 
event={"ID":"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44","Type":"ContainerStarted","Data":"09142a0aee28f2c84807a42113649ee1fcd77643722796f0dcd1c61f3772e057"} Nov 25 15:54:15 crc kubenswrapper[4704]: I1125 15:54:15.063927 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-m7kk7" podStartSLOduration=7.327933801 podStartE2EDuration="19.063898162s" podCreationTimestamp="2025-11-25 15:53:56 +0000 UTC" firstStartedPulling="2025-11-25 15:54:02.047058329 +0000 UTC m=+1128.315332110" lastFinishedPulling="2025-11-25 15:54:13.78302269 +0000 UTC m=+1140.051296471" observedRunningTime="2025-11-25 15:54:15.055726485 +0000 UTC m=+1141.324000266" watchObservedRunningTime="2025-11-25 15:54:15.063898162 +0000 UTC m=+1141.332171953" Nov 25 15:54:26 crc kubenswrapper[4704]: I1125 15:54:26.290694 4704 generic.go:334] "Generic (PLEG): container finished" podID="b0d727c8-1b1e-4a10-ad2b-7500a2d78d44" containerID="09142a0aee28f2c84807a42113649ee1fcd77643722796f0dcd1c61f3772e057" exitCode=0 Nov 25 15:54:26 crc kubenswrapper[4704]: I1125 15:54:26.290822 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-m7kk7" event={"ID":"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44","Type":"ContainerDied","Data":"09142a0aee28f2c84807a42113649ee1fcd77643722796f0dcd1c61f3772e057"} Nov 25 15:54:27 crc kubenswrapper[4704]: I1125 15:54:27.562064 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-m7kk7" Nov 25 15:54:27 crc kubenswrapper[4704]: I1125 15:54:27.701316 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbkch\" (UniqueName: \"kubernetes.io/projected/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-kube-api-access-zbkch\") pod \"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44\" (UID: \"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44\") " Nov 25 15:54:27 crc kubenswrapper[4704]: I1125 15:54:27.701479 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-db-sync-config-data\") pod \"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44\" (UID: \"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44\") " Nov 25 15:54:27 crc kubenswrapper[4704]: I1125 15:54:27.701496 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-config-data\") pod \"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44\" (UID: \"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44\") " Nov 25 15:54:27 crc kubenswrapper[4704]: I1125 15:54:27.711084 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b0d727c8-1b1e-4a10-ad2b-7500a2d78d44" (UID: "b0d727c8-1b1e-4a10-ad2b-7500a2d78d44"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:54:27 crc kubenswrapper[4704]: I1125 15:54:27.711150 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-kube-api-access-zbkch" (OuterVolumeSpecName: "kube-api-access-zbkch") pod "b0d727c8-1b1e-4a10-ad2b-7500a2d78d44" (UID: "b0d727c8-1b1e-4a10-ad2b-7500a2d78d44"). 
InnerVolumeSpecName "kube-api-access-zbkch". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:54:27 crc kubenswrapper[4704]: I1125 15:54:27.749061 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-config-data" (OuterVolumeSpecName: "config-data") pod "b0d727c8-1b1e-4a10-ad2b-7500a2d78d44" (UID: "b0d727c8-1b1e-4a10-ad2b-7500a2d78d44"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:54:27 crc kubenswrapper[4704]: I1125 15:54:27.803018 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbkch\" (UniqueName: \"kubernetes.io/projected/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-kube-api-access-zbkch\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:27 crc kubenswrapper[4704]: I1125 15:54:27.803069 4704 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:27 crc kubenswrapper[4704]: I1125 15:54:27.803079 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:28 crc kubenswrapper[4704]: I1125 15:54:28.308276 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-m7kk7" event={"ID":"b0d727c8-1b1e-4a10-ad2b-7500a2d78d44","Type":"ContainerDied","Data":"4269586c15b0c33688a75b2c44b76fb90281d62034fa9f84ec446c4b5fe8bf0d"} Nov 25 15:54:28 crc kubenswrapper[4704]: I1125 15:54:28.308330 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4269586c15b0c33688a75b2c44b76fb90281d62034fa9f84ec446c4b5fe8bf0d" Nov 25 15:54:28 crc kubenswrapper[4704]: I1125 15:54:28.308693 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-m7kk7" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.676531 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:54:29 crc kubenswrapper[4704]: E1125 15:54:29.677499 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d727c8-1b1e-4a10-ad2b-7500a2d78d44" containerName="glance-db-sync" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.677520 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d727c8-1b1e-4a10-ad2b-7500a2d78d44" containerName="glance-db-sync" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.677691 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d727c8-1b1e-4a10-ad2b-7500a2d78d44" containerName="glance-db-sync" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.679243 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.681372 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.682614 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-6sqgt" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.682690 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.689990 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.701369 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.703417 4704 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.727740 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.736689 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c5825a-c141-4f0f-84bf-b562a5decca3-config-data\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.736751 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.736782 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41c5825a-c141-4f0f-84bf-b562a5decca3-logs\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.736830 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41c5825a-c141-4f0f-84bf-b562a5decca3-httpd-run\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.736881 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dev\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-dev\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.736965 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.737040 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hm84\" (UniqueName: \"kubernetes.io/projected/41c5825a-c141-4f0f-84bf-b562a5decca3-kube-api-access-7hm84\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.737065 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-lib-modules\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.737100 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-sys\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.737175 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" 
(UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-etc-nvme\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.737206 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-run\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.737234 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.737278 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41c5825a-c141-4f0f-84bf-b562a5decca3-scripts\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.737310 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.839111 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-sys\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.839962 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840127 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-etc-nvme\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840236 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-run\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840354 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-etc-nvme\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840355 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-run\") pod \"glance-default-single-0\" (UID: 
\"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840373 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.839256 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-sys\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840466 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840502 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41c5825a-c141-4f0f-84bf-b562a5decca3-scripts\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840525 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-etc-nvme\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc 
kubenswrapper[4704]: I1125 15:54:29.840539 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-logs\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840563 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840580 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840633 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c5825a-c141-4f0f-84bf-b562a5decca3-config-data\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840657 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840673 4704 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41c5825a-c141-4f0f-84bf-b562a5decca3-logs\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840693 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41c5825a-c141-4f0f-84bf-b562a5decca3-httpd-run\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840709 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840722 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-dev\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840754 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-dev\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840777 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-scripts\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840818 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-httpd-run\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840854 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-sys\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840882 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glqwd\" (UniqueName: \"kubernetes.io/projected/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-kube-api-access-glqwd\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840917 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-lib-modules\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840947 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.840986 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-config-data\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.841040 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hm84\" (UniqueName: \"kubernetes.io/projected/41c5825a-c141-4f0f-84bf-b562a5decca3-kube-api-access-7hm84\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.841066 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-lib-modules\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.841097 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-run\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.841437 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.841752 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-lib-modules\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.841807 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.842025 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-dev\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.842191 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.842426 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41c5825a-c141-4f0f-84bf-b562a5decca3-httpd-run\") pod \"glance-default-single-0\" (UID: 
\"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.842499 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41c5825a-c141-4f0f-84bf-b562a5decca3-logs\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.843164 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.849085 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c5825a-c141-4f0f-84bf-b562a5decca3-config-data\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.857032 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41c5825a-c141-4f0f-84bf-b562a5decca3-scripts\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.858564 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hm84\" (UniqueName: \"kubernetes.io/projected/41c5825a-c141-4f0f-84bf-b562a5decca3-kube-api-access-7hm84\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 
15:54:29.864028 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.868511 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.942830 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.942882 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-etc-nvme\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.942899 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-logs\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.942921 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.942959 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.942973 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-dev\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.942993 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-scripts\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.943007 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-httpd-run\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.943030 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-sys\") pod \"glance-default-single-1\" (UID: 
\"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.943044 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glqwd\" (UniqueName: \"kubernetes.io/projected/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-kube-api-access-glqwd\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.943065 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-lib-modules\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.943089 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-config-data\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.943119 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-run\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.943141 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 
15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.943321 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.944233 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-run\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.944367 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.944315 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.944392 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.944432 4704 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-sys\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.944480 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-etc-nvme\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.944504 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-dev\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.944441 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-lib-modules\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.946892 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-logs\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.946978 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-httpd-run\") pod 
\"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.948484 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-scripts\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.949315 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-config-data\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.961306 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glqwd\" (UniqueName: \"kubernetes.io/projected/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-kube-api-access-glqwd\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.962492 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:29 crc kubenswrapper[4704]: I1125 15:54:29.963519 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-1\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:30 crc 
kubenswrapper[4704]: I1125 15:54:30.012644 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:30 crc kubenswrapper[4704]: I1125 15:54:30.021310 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:30 crc kubenswrapper[4704]: I1125 15:54:30.262822 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:54:30 crc kubenswrapper[4704]: I1125 15:54:30.324746 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"41c5825a-c141-4f0f-84bf-b562a5decca3","Type":"ContainerStarted","Data":"2df6634eb386373eb55e12227d00522d581ce5a8408b8f3f3e77269ca730852c"} Nov 25 15:54:30 crc kubenswrapper[4704]: I1125 15:54:30.563380 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 25 15:54:30 crc kubenswrapper[4704]: W1125 15:54:30.572012 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3b45c53_3450_4ea9_91a1_9de1130f9b1e.slice/crio-9930010d34714f6092892ded00721cbf0987317be0fa35785600168a3f4f088d WatchSource:0}: Error finding container 9930010d34714f6092892ded00721cbf0987317be0fa35785600168a3f4f088d: Status 404 returned error can't find the container with id 9930010d34714f6092892ded00721cbf0987317be0fa35785600168a3f4f088d Nov 25 15:54:31 crc kubenswrapper[4704]: I1125 15:54:31.336498 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"c3b45c53-3450-4ea9-91a1-9de1130f9b1e","Type":"ContainerStarted","Data":"1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89"} Nov 25 15:54:31 crc kubenswrapper[4704]: I1125 15:54:31.337391 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"c3b45c53-3450-4ea9-91a1-9de1130f9b1e","Type":"ContainerStarted","Data":"9930010d34714f6092892ded00721cbf0987317be0fa35785600168a3f4f088d"} Nov 25 15:54:31 crc kubenswrapper[4704]: I1125 15:54:31.339688 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"41c5825a-c141-4f0f-84bf-b562a5decca3","Type":"ContainerStarted","Data":"0f5ad9b8d281af0e35b52be7e12df9e82b8e7876d4b2e19f5b11ff93d830bab6"} Nov 25 15:54:32 crc kubenswrapper[4704]: I1125 15:54:32.154457 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 25 15:54:32 crc kubenswrapper[4704]: I1125 15:54:32.348012 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"41c5825a-c141-4f0f-84bf-b562a5decca3","Type":"ContainerStarted","Data":"7189c2d812a54a30357ae17b16aab43a026742fea9da6a3002dc36efb3e49c5c"} Nov 25 15:54:32 crc kubenswrapper[4704]: I1125 15:54:32.350072 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"c3b45c53-3450-4ea9-91a1-9de1130f9b1e","Type":"ContainerStarted","Data":"e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047"} Nov 25 15:54:32 crc kubenswrapper[4704]: I1125 15:54:32.387004 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=4.386983473 podStartE2EDuration="4.386983473s" podCreationTimestamp="2025-11-25 15:54:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:54:32.37925862 +0000 UTC m=+1158.647532401" watchObservedRunningTime="2025-11-25 15:54:32.386983473 +0000 UTC m=+1158.655257254" Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.356377 4704 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="c3b45c53-3450-4ea9-91a1-9de1130f9b1e" containerName="glance-log" containerID="cri-o://1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89" gracePeriod=30 Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.356541 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="c3b45c53-3450-4ea9-91a1-9de1130f9b1e" containerName="glance-httpd" containerID="cri-o://e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047" gracePeriod=30 Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.746656 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900041 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-logs\") pod \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900120 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-etc-nvme\") pod \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900162 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-sys\") pod \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900193 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-dev\") pod \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900209 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-lib-modules\") pod \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900236 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-httpd-run\") pod \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900261 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-var-locks-brick\") pod \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900288 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900315 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-config-data\") pod \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900333 4704 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-run\") pod \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900356 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-scripts\") pod \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900412 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900460 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-etc-iscsi\") pod \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900482 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glqwd\" (UniqueName: \"kubernetes.io/projected/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-kube-api-access-glqwd\") pod \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\" (UID: \"c3b45c53-3450-4ea9-91a1-9de1130f9b1e\") " Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900893 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-logs" (OuterVolumeSpecName: "logs") pod "c3b45c53-3450-4ea9-91a1-9de1130f9b1e" (UID: "c3b45c53-3450-4ea9-91a1-9de1130f9b1e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900966 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "c3b45c53-3450-4ea9-91a1-9de1130f9b1e" (UID: "c3b45c53-3450-4ea9-91a1-9de1130f9b1e"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.900994 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "c3b45c53-3450-4ea9-91a1-9de1130f9b1e" (UID: "c3b45c53-3450-4ea9-91a1-9de1130f9b1e"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.901016 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-sys" (OuterVolumeSpecName: "sys") pod "c3b45c53-3450-4ea9-91a1-9de1130f9b1e" (UID: "c3b45c53-3450-4ea9-91a1-9de1130f9b1e"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.901039 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-dev" (OuterVolumeSpecName: "dev") pod "c3b45c53-3450-4ea9-91a1-9de1130f9b1e" (UID: "c3b45c53-3450-4ea9-91a1-9de1130f9b1e"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.901064 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "c3b45c53-3450-4ea9-91a1-9de1130f9b1e" (UID: "c3b45c53-3450-4ea9-91a1-9de1130f9b1e"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.901300 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c3b45c53-3450-4ea9-91a1-9de1130f9b1e" (UID: "c3b45c53-3450-4ea9-91a1-9de1130f9b1e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.901419 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "c3b45c53-3450-4ea9-91a1-9de1130f9b1e" (UID: "c3b45c53-3450-4ea9-91a1-9de1130f9b1e"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.902015 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-run" (OuterVolumeSpecName: "run") pod "c3b45c53-3450-4ea9-91a1-9de1130f9b1e" (UID: "c3b45c53-3450-4ea9-91a1-9de1130f9b1e"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.912146 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-kube-api-access-glqwd" (OuterVolumeSpecName: "kube-api-access-glqwd") pod "c3b45c53-3450-4ea9-91a1-9de1130f9b1e" (UID: "c3b45c53-3450-4ea9-91a1-9de1130f9b1e"). InnerVolumeSpecName "kube-api-access-glqwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.912175 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "c3b45c53-3450-4ea9-91a1-9de1130f9b1e" (UID: "c3b45c53-3450-4ea9-91a1-9de1130f9b1e"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.912272 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-scripts" (OuterVolumeSpecName: "scripts") pod "c3b45c53-3450-4ea9-91a1-9de1130f9b1e" (UID: "c3b45c53-3450-4ea9-91a1-9de1130f9b1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.926058 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance-cache") pod "c3b45c53-3450-4ea9-91a1-9de1130f9b1e" (UID: "c3b45c53-3450-4ea9-91a1-9de1130f9b1e"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:54:33 crc kubenswrapper[4704]: I1125 15:54:33.947545 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-config-data" (OuterVolumeSpecName: "config-data") pod "c3b45c53-3450-4ea9-91a1-9de1130f9b1e" (UID: "c3b45c53-3450-4ea9-91a1-9de1130f9b1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.002402 4704 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-logs\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.002469 4704 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.002485 4704 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-sys\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.002497 4704 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-dev\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.002510 4704 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.002524 4704 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-httpd-run\") on node \"crc\" DevicePath 
\"\"" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.002537 4704 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.002574 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.002587 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.002604 4704 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-run\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.002619 4704 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.002637 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.002648 4704 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.002660 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glqwd\" (UniqueName: 
\"kubernetes.io/projected/c3b45c53-3450-4ea9-91a1-9de1130f9b1e-kube-api-access-glqwd\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.017857 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.019893 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.104183 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.104212 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.364824 4704 generic.go:334] "Generic (PLEG): container finished" podID="c3b45c53-3450-4ea9-91a1-9de1130f9b1e" containerID="e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047" exitCode=0 Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.364857 4704 generic.go:334] "Generic (PLEG): container finished" podID="c3b45c53-3450-4ea9-91a1-9de1130f9b1e" containerID="1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89" exitCode=143 Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.364880 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"c3b45c53-3450-4ea9-91a1-9de1130f9b1e","Type":"ContainerDied","Data":"e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047"} Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.364909 4704 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"c3b45c53-3450-4ea9-91a1-9de1130f9b1e","Type":"ContainerDied","Data":"1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89"} Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.364918 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"c3b45c53-3450-4ea9-91a1-9de1130f9b1e","Type":"ContainerDied","Data":"9930010d34714f6092892ded00721cbf0987317be0fa35785600168a3f4f088d"} Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.364934 4704 scope.go:117] "RemoveContainer" containerID="e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.365049 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.396258 4704 scope.go:117] "RemoveContainer" containerID="1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.402045 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.406613 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.424259 4704 scope.go:117] "RemoveContainer" containerID="e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047" Nov 25 15:54:34 crc kubenswrapper[4704]: E1125 15:54:34.424756 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047\": container with ID starting with e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047 not found: ID does not exist" 
containerID="e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.424852 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047"} err="failed to get container status \"e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047\": rpc error: code = NotFound desc = could not find container \"e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047\": container with ID starting with e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047 not found: ID does not exist" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.424876 4704 scope.go:117] "RemoveContainer" containerID="1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89" Nov 25 15:54:34 crc kubenswrapper[4704]: E1125 15:54:34.425245 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89\": container with ID starting with 1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89 not found: ID does not exist" containerID="1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.425281 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89"} err="failed to get container status \"1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89\": rpc error: code = NotFound desc = could not find container \"1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89\": container with ID starting with 1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89 not found: ID does not exist" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.425296 4704 scope.go:117] 
"RemoveContainer" containerID="e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.425536 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047"} err="failed to get container status \"e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047\": rpc error: code = NotFound desc = could not find container \"e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047\": container with ID starting with e5fa0243351302e39cffa15e5e99f7cb39510df7c38ef0f09599b99ebb19f047 not found: ID does not exist" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.425559 4704 scope.go:117] "RemoveContainer" containerID="1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.425766 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89"} err="failed to get container status \"1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89\": rpc error: code = NotFound desc = could not find container \"1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89\": container with ID starting with 1a28d3cfeb2999cc01c4ebc1f2dceae85785afdd2581cc809ffcbf85edba6d89 not found: ID does not exist" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.426841 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3b45c53-3450-4ea9-91a1-9de1130f9b1e" path="/var/lib/kubelet/pods/c3b45c53-3450-4ea9-91a1-9de1130f9b1e/volumes" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.428173 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 25 15:54:34 crc kubenswrapper[4704]: E1125 15:54:34.428415 4704 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c3b45c53-3450-4ea9-91a1-9de1130f9b1e" containerName="glance-httpd" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.428433 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b45c53-3450-4ea9-91a1-9de1130f9b1e" containerName="glance-httpd" Nov 25 15:54:34 crc kubenswrapper[4704]: E1125 15:54:34.428457 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b45c53-3450-4ea9-91a1-9de1130f9b1e" containerName="glance-log" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.428465 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b45c53-3450-4ea9-91a1-9de1130f9b1e" containerName="glance-log" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.428589 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b45c53-3450-4ea9-91a1-9de1130f9b1e" containerName="glance-log" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.428609 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b45c53-3450-4ea9-91a1-9de1130f9b1e" containerName="glance-httpd" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.430095 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.442517 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.613135 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.613231 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.613270 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92b220a0-eefb-435c-92e7-f078d294689f-logs\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.613298 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-sys\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.613388 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/92b220a0-eefb-435c-92e7-f078d294689f-httpd-run\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.613422 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2rkd\" (UniqueName: \"kubernetes.io/projected/92b220a0-eefb-435c-92e7-f078d294689f-kube-api-access-f2rkd\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.613445 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92b220a0-eefb-435c-92e7-f078d294689f-scripts\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.613486 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-etc-nvme\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.613512 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.613526 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-run\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.613547 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-dev\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.613566 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-lib-modules\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.613581 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.613602 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b220a0-eefb-435c-92e7-f078d294689f-config-data\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715218 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2rkd\" (UniqueName: 
\"kubernetes.io/projected/92b220a0-eefb-435c-92e7-f078d294689f-kube-api-access-f2rkd\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715276 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92b220a0-eefb-435c-92e7-f078d294689f-scripts\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715349 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-etc-nvme\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715378 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-run\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715397 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715425 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-dev\") pod \"glance-default-single-1\" (UID: 
\"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715446 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-lib-modules\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715464 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715490 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b220a0-eefb-435c-92e7-f078d294689f-config-data\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715513 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715542 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 
15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715572 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92b220a0-eefb-435c-92e7-f078d294689f-logs\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715606 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-sys\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715628 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/92b220a0-eefb-435c-92e7-f078d294689f-httpd-run\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715744 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-etc-nvme\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715682 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-run\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715690 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-lib-modules\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.716266 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.716313 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.716319 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-sys\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.716311 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.716344 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-var-locks-brick\") pod 
\"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.715720 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-dev\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.716751 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92b220a0-eefb-435c-92e7-f078d294689f-logs\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.727682 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/92b220a0-eefb-435c-92e7-f078d294689f-httpd-run\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.732413 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b220a0-eefb-435c-92e7-f078d294689f-config-data\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.739539 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92b220a0-eefb-435c-92e7-f078d294689f-scripts\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 
15:54:34.740734 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2rkd\" (UniqueName: \"kubernetes.io/projected/92b220a0-eefb-435c-92e7-f078d294689f-kube-api-access-f2rkd\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.745742 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:34 crc kubenswrapper[4704]: I1125 15:54:34.747452 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-single-1\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:35 crc kubenswrapper[4704]: I1125 15:54:35.047237 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:35 crc kubenswrapper[4704]: I1125 15:54:35.524772 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 25 15:54:36 crc kubenswrapper[4704]: I1125 15:54:36.382304 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"92b220a0-eefb-435c-92e7-f078d294689f","Type":"ContainerStarted","Data":"f8a3fa677ec9aedeb7ea9bb40f61d3dcca235bcc00252324cb128c1b2609ee71"} Nov 25 15:54:36 crc kubenswrapper[4704]: I1125 15:54:36.382825 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"92b220a0-eefb-435c-92e7-f078d294689f","Type":"ContainerStarted","Data":"f5be42917874dd904adde8e131e4aaec272ff241df8f5034c49efd4e4e0b1192"} Nov 25 15:54:36 crc kubenswrapper[4704]: I1125 15:54:36.382842 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"92b220a0-eefb-435c-92e7-f078d294689f","Type":"ContainerStarted","Data":"b272ab752c32222bda2309162307ac74f1d05e7cdd54b872c972546359cf5b09"} Nov 25 15:54:36 crc kubenswrapper[4704]: I1125 15:54:36.403746 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=2.4037222480000002 podStartE2EDuration="2.403722248s" podCreationTimestamp="2025-11-25 15:54:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:54:36.40068108 +0000 UTC m=+1162.668954871" watchObservedRunningTime="2025-11-25 15:54:36.403722248 +0000 UTC m=+1162.671996019" Nov 25 15:54:40 crc kubenswrapper[4704]: I1125 15:54:40.014652 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:40 crc 
kubenswrapper[4704]: I1125 15:54:40.014716 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:40 crc kubenswrapper[4704]: I1125 15:54:40.043563 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:40 crc kubenswrapper[4704]: I1125 15:54:40.055621 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:40 crc kubenswrapper[4704]: I1125 15:54:40.410556 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:40 crc kubenswrapper[4704]: I1125 15:54:40.411102 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:42 crc kubenswrapper[4704]: I1125 15:54:42.431466 4704 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 15:54:42 crc kubenswrapper[4704]: I1125 15:54:42.431955 4704 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 15:54:42 crc kubenswrapper[4704]: I1125 15:54:42.769253 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:42 crc kubenswrapper[4704]: I1125 15:54:42.835290 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:45 crc kubenswrapper[4704]: I1125 15:54:45.048429 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:45 crc kubenswrapper[4704]: I1125 15:54:45.049117 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:45 crc kubenswrapper[4704]: I1125 15:54:45.087465 
4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:45 crc kubenswrapper[4704]: I1125 15:54:45.089086 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:45 crc kubenswrapper[4704]: I1125 15:54:45.453124 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:45 crc kubenswrapper[4704]: I1125 15:54:45.453182 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:47 crc kubenswrapper[4704]: I1125 15:54:47.686120 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:47 crc kubenswrapper[4704]: I1125 15:54:47.686684 4704 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 15:54:47 crc kubenswrapper[4704]: I1125 15:54:47.689166 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:54:47 crc kubenswrapper[4704]: I1125 15:54:47.746550 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:54:47 crc kubenswrapper[4704]: I1125 15:54:47.746871 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="41c5825a-c141-4f0f-84bf-b562a5decca3" containerName="glance-log" containerID="cri-o://0f5ad9b8d281af0e35b52be7e12df9e82b8e7876d4b2e19f5b11ff93d830bab6" gracePeriod=30 Nov 25 15:54:47 crc kubenswrapper[4704]: I1125 15:54:47.746990 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="41c5825a-c141-4f0f-84bf-b562a5decca3" containerName="glance-httpd" 
containerID="cri-o://7189c2d812a54a30357ae17b16aab43a026742fea9da6a3002dc36efb3e49c5c" gracePeriod=30 Nov 25 15:54:48 crc kubenswrapper[4704]: I1125 15:54:48.478137 4704 generic.go:334] "Generic (PLEG): container finished" podID="41c5825a-c141-4f0f-84bf-b562a5decca3" containerID="0f5ad9b8d281af0e35b52be7e12df9e82b8e7876d4b2e19f5b11ff93d830bab6" exitCode=143 Nov 25 15:54:48 crc kubenswrapper[4704]: I1125 15:54:48.478194 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"41c5825a-c141-4f0f-84bf-b562a5decca3","Type":"ContainerDied","Data":"0f5ad9b8d281af0e35b52be7e12df9e82b8e7876d4b2e19f5b11ff93d830bab6"} Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.291607 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.368218 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"41c5825a-c141-4f0f-84bf-b562a5decca3\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.368310 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-dev\") pod \"41c5825a-c141-4f0f-84bf-b562a5decca3\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.368344 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41c5825a-c141-4f0f-84bf-b562a5decca3-logs\") pod \"41c5825a-c141-4f0f-84bf-b562a5decca3\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.368372 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-sys\") pod \"41c5825a-c141-4f0f-84bf-b562a5decca3\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.368391 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41c5825a-c141-4f0f-84bf-b562a5decca3-scripts\") pod \"41c5825a-c141-4f0f-84bf-b562a5decca3\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.368417 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c5825a-c141-4f0f-84bf-b562a5decca3-config-data\") pod \"41c5825a-c141-4f0f-84bf-b562a5decca3\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.368439 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"41c5825a-c141-4f0f-84bf-b562a5decca3\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.368460 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-etc-nvme\") pod \"41c5825a-c141-4f0f-84bf-b562a5decca3\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.368486 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hm84\" (UniqueName: \"kubernetes.io/projected/41c5825a-c141-4f0f-84bf-b562a5decca3-kube-api-access-7hm84\") pod \"41c5825a-c141-4f0f-84bf-b562a5decca3\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.368514 4704 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-run\") pod \"41c5825a-c141-4f0f-84bf-b562a5decca3\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.368542 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-var-locks-brick\") pod \"41c5825a-c141-4f0f-84bf-b562a5decca3\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.368562 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-etc-iscsi\") pod \"41c5825a-c141-4f0f-84bf-b562a5decca3\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.368627 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-lib-modules\") pod \"41c5825a-c141-4f0f-84bf-b562a5decca3\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.368721 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41c5825a-c141-4f0f-84bf-b562a5decca3-httpd-run\") pod \"41c5825a-c141-4f0f-84bf-b562a5decca3\" (UID: \"41c5825a-c141-4f0f-84bf-b562a5decca3\") " Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.369536 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41c5825a-c141-4f0f-84bf-b562a5decca3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "41c5825a-c141-4f0f-84bf-b562a5decca3" (UID: "41c5825a-c141-4f0f-84bf-b562a5decca3"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.370803 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-run" (OuterVolumeSpecName: "run") pod "41c5825a-c141-4f0f-84bf-b562a5decca3" (UID: "41c5825a-c141-4f0f-84bf-b562a5decca3"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.370850 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-dev" (OuterVolumeSpecName: "dev") pod "41c5825a-c141-4f0f-84bf-b562a5decca3" (UID: "41c5825a-c141-4f0f-84bf-b562a5decca3"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.370844 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "41c5825a-c141-4f0f-84bf-b562a5decca3" (UID: "41c5825a-c141-4f0f-84bf-b562a5decca3"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.370803 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-sys" (OuterVolumeSpecName: "sys") pod "41c5825a-c141-4f0f-84bf-b562a5decca3" (UID: "41c5825a-c141-4f0f-84bf-b562a5decca3"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.371108 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "41c5825a-c141-4f0f-84bf-b562a5decca3" (UID: "41c5825a-c141-4f0f-84bf-b562a5decca3"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.371167 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "41c5825a-c141-4f0f-84bf-b562a5decca3" (UID: "41c5825a-c141-4f0f-84bf-b562a5decca3"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.371285 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "41c5825a-c141-4f0f-84bf-b562a5decca3" (UID: "41c5825a-c141-4f0f-84bf-b562a5decca3"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.371328 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41c5825a-c141-4f0f-84bf-b562a5decca3-logs" (OuterVolumeSpecName: "logs") pod "41c5825a-c141-4f0f-84bf-b562a5decca3" (UID: "41c5825a-c141-4f0f-84bf-b562a5decca3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.379206 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "41c5825a-c141-4f0f-84bf-b562a5decca3" (UID: "41c5825a-c141-4f0f-84bf-b562a5decca3"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.381131 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c5825a-c141-4f0f-84bf-b562a5decca3-kube-api-access-7hm84" (OuterVolumeSpecName: "kube-api-access-7hm84") pod "41c5825a-c141-4f0f-84bf-b562a5decca3" (UID: "41c5825a-c141-4f0f-84bf-b562a5decca3"). InnerVolumeSpecName "kube-api-access-7hm84". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.385977 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "41c5825a-c141-4f0f-84bf-b562a5decca3" (UID: "41c5825a-c141-4f0f-84bf-b562a5decca3"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.390902 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c5825a-c141-4f0f-84bf-b562a5decca3-scripts" (OuterVolumeSpecName: "scripts") pod "41c5825a-c141-4f0f-84bf-b562a5decca3" (UID: "41c5825a-c141-4f0f-84bf-b562a5decca3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.472784 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.472868 4704 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.472885 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hm84\" (UniqueName: \"kubernetes.io/projected/41c5825a-c141-4f0f-84bf-b562a5decca3-kube-api-access-7hm84\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.472900 4704 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-run\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.472913 4704 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.472925 4704 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.472936 4704 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.472945 4704 reconciler_common.go:293] "Volume detached for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41c5825a-c141-4f0f-84bf-b562a5decca3-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.472959 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.472971 4704 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-dev\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.472982 4704 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41c5825a-c141-4f0f-84bf-b562a5decca3-logs\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.473005 4704 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41c5825a-c141-4f0f-84bf-b562a5decca3-sys\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.473019 4704 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41c5825a-c141-4f0f-84bf-b562a5decca3-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.477918 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c5825a-c141-4f0f-84bf-b562a5decca3-config-data" (OuterVolumeSpecName: "config-data") pod "41c5825a-c141-4f0f-84bf-b562a5decca3" (UID: "41c5825a-c141-4f0f-84bf-b562a5decca3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.497828 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.497897 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.508475 4704 generic.go:334] "Generic (PLEG): container finished" podID="41c5825a-c141-4f0f-84bf-b562a5decca3" containerID="7189c2d812a54a30357ae17b16aab43a026742fea9da6a3002dc36efb3e49c5c" exitCode=0 Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.508532 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.508562 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"41c5825a-c141-4f0f-84bf-b562a5decca3","Type":"ContainerDied","Data":"7189c2d812a54a30357ae17b16aab43a026742fea9da6a3002dc36efb3e49c5c"} Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.509013 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"41c5825a-c141-4f0f-84bf-b562a5decca3","Type":"ContainerDied","Data":"2df6634eb386373eb55e12227d00522d581ce5a8408b8f3f3e77269ca730852c"} Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.509037 4704 scope.go:117] "RemoveContainer" containerID="7189c2d812a54a30357ae17b16aab43a026742fea9da6a3002dc36efb3e49c5c" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.532942 4704 scope.go:117] "RemoveContainer" containerID="0f5ad9b8d281af0e35b52be7e12df9e82b8e7876d4b2e19f5b11ff93d830bab6" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 
15:54:51.550354 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.555775 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.556579 4704 scope.go:117] "RemoveContainer" containerID="7189c2d812a54a30357ae17b16aab43a026742fea9da6a3002dc36efb3e49c5c" Nov 25 15:54:51 crc kubenswrapper[4704]: E1125 15:54:51.557015 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7189c2d812a54a30357ae17b16aab43a026742fea9da6a3002dc36efb3e49c5c\": container with ID starting with 7189c2d812a54a30357ae17b16aab43a026742fea9da6a3002dc36efb3e49c5c not found: ID does not exist" containerID="7189c2d812a54a30357ae17b16aab43a026742fea9da6a3002dc36efb3e49c5c" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.557046 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7189c2d812a54a30357ae17b16aab43a026742fea9da6a3002dc36efb3e49c5c"} err="failed to get container status \"7189c2d812a54a30357ae17b16aab43a026742fea9da6a3002dc36efb3e49c5c\": rpc error: code = NotFound desc = could not find container \"7189c2d812a54a30357ae17b16aab43a026742fea9da6a3002dc36efb3e49c5c\": container with ID starting with 7189c2d812a54a30357ae17b16aab43a026742fea9da6a3002dc36efb3e49c5c not found: ID does not exist" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.557065 4704 scope.go:117] "RemoveContainer" containerID="0f5ad9b8d281af0e35b52be7e12df9e82b8e7876d4b2e19f5b11ff93d830bab6" Nov 25 15:54:51 crc kubenswrapper[4704]: E1125 15:54:51.557314 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f5ad9b8d281af0e35b52be7e12df9e82b8e7876d4b2e19f5b11ff93d830bab6\": container with ID starting with 
0f5ad9b8d281af0e35b52be7e12df9e82b8e7876d4b2e19f5b11ff93d830bab6 not found: ID does not exist" containerID="0f5ad9b8d281af0e35b52be7e12df9e82b8e7876d4b2e19f5b11ff93d830bab6" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.557335 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f5ad9b8d281af0e35b52be7e12df9e82b8e7876d4b2e19f5b11ff93d830bab6"} err="failed to get container status \"0f5ad9b8d281af0e35b52be7e12df9e82b8e7876d4b2e19f5b11ff93d830bab6\": rpc error: code = NotFound desc = could not find container \"0f5ad9b8d281af0e35b52be7e12df9e82b8e7876d4b2e19f5b11ff93d830bab6\": container with ID starting with 0f5ad9b8d281af0e35b52be7e12df9e82b8e7876d4b2e19f5b11ff93d830bab6 not found: ID does not exist" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.578774 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:54:51 crc kubenswrapper[4704]: E1125 15:54:51.579131 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c5825a-c141-4f0f-84bf-b562a5decca3" containerName="glance-log" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.579145 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c5825a-c141-4f0f-84bf-b562a5decca3" containerName="glance-log" Nov 25 15:54:51 crc kubenswrapper[4704]: E1125 15:54:51.579163 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c5825a-c141-4f0f-84bf-b562a5decca3" containerName="glance-httpd" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.579169 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c5825a-c141-4f0f-84bf-b562a5decca3" containerName="glance-httpd" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.579306 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c5825a-c141-4f0f-84bf-b562a5decca3" containerName="glance-log" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.579316 4704 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="41c5825a-c141-4f0f-84bf-b562a5decca3" containerName="glance-httpd" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.580114 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.582413 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.582437 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c5825a-c141-4f0f-84bf-b562a5decca3-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.582446 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.595908 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.684612 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.684685 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-dev\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc 
kubenswrapper[4704]: I1125 15:54:51.684724 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a574a830-e183-484d-a06c-660b14b93539-logs\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.684749 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-sys\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.684774 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6562k\" (UniqueName: \"kubernetes.io/projected/a574a830-e183-484d-a06c-660b14b93539-kube-api-access-6562k\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.684818 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.684941 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 
15:54:51.684983 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.685018 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-run\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.685112 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a574a830-e183-484d-a06c-660b14b93539-scripts\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.685144 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-lib-modules\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.685211 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a574a830-e183-484d-a06c-660b14b93539-config-data\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.685294 
4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-etc-nvme\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.685323 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a574a830-e183-484d-a06c-660b14b93539-httpd-run\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.787204 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a574a830-e183-484d-a06c-660b14b93539-httpd-run\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.787277 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.787324 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-dev\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.787354 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/a574a830-e183-484d-a06c-660b14b93539-logs\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.787379 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-sys\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.787399 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6562k\" (UniqueName: \"kubernetes.io/projected/a574a830-e183-484d-a06c-660b14b93539-kube-api-access-6562k\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.787420 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.787453 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.787483 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-etc-iscsi\") pod 
\"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.787507 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-run\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.787502 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-dev\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.787676 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.787957 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a574a830-e183-484d-a06c-660b14b93539-httpd-run\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.787533 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a574a830-e183-484d-a06c-660b14b93539-scripts\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 
25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.788015 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a574a830-e183-484d-a06c-660b14b93539-logs\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.788035 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-lib-modules\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.788076 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a574a830-e183-484d-a06c-660b14b93539-config-data\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.788078 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.788104 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.788124 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"run\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-run\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.788121 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-sys\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.788176 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.788216 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-lib-modules\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.788639 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-etc-nvme\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.788775 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-etc-nvme\") pod 
\"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.792345 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a574a830-e183-484d-a06c-660b14b93539-config-data\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.793431 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a574a830-e183-484d-a06c-660b14b93539-scripts\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.808338 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6562k\" (UniqueName: \"kubernetes.io/projected/a574a830-e183-484d-a06c-660b14b93539-kube-api-access-6562k\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.811679 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc kubenswrapper[4704]: I1125 15:54:51.812137 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:51 crc 
kubenswrapper[4704]: I1125 15:54:51.899580 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:54:52 crc kubenswrapper[4704]: I1125 15:54:52.135437 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:54:52 crc kubenswrapper[4704]: I1125 15:54:52.425156 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41c5825a-c141-4f0f-84bf-b562a5decca3" path="/var/lib/kubelet/pods/41c5825a-c141-4f0f-84bf-b562a5decca3/volumes" Nov 25 15:54:52 crc kubenswrapper[4704]: I1125 15:54:52.521954 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a574a830-e183-484d-a06c-660b14b93539","Type":"ContainerStarted","Data":"1d3eda75f1e6c9474bd9dcaddcb0a430dae06ef3d9e46fcc5d627b5ee06ff52e"} Nov 25 15:54:52 crc kubenswrapper[4704]: I1125 15:54:52.522008 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a574a830-e183-484d-a06c-660b14b93539","Type":"ContainerStarted","Data":"d83a2ad6a169d38a648b6e361411654693f6dfad1526111757c1a3e0537f540e"} Nov 25 15:54:52 crc kubenswrapper[4704]: I1125 15:54:52.522020 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a574a830-e183-484d-a06c-660b14b93539","Type":"ContainerStarted","Data":"d05285c634a9735020cb61e954d886e67d7af77fa49d57088fd81ce5476fbed7"} Nov 25 15:54:52 crc kubenswrapper[4704]: I1125 15:54:52.552468 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=1.5524474910000001 podStartE2EDuration="1.552447491s" podCreationTimestamp="2025-11-25 15:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 
15:54:52.547886679 +0000 UTC m=+1178.816160470" watchObservedRunningTime="2025-11-25 15:54:52.552447491 +0000 UTC m=+1178.820721272" Nov 25 15:55:01 crc kubenswrapper[4704]: I1125 15:55:01.900733 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:01 crc kubenswrapper[4704]: I1125 15:55:01.901884 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:01 crc kubenswrapper[4704]: I1125 15:55:01.928925 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:01 crc kubenswrapper[4704]: I1125 15:55:01.947291 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:02 crc kubenswrapper[4704]: I1125 15:55:02.588924 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:02 crc kubenswrapper[4704]: I1125 15:55:02.588964 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:04 crc kubenswrapper[4704]: I1125 15:55:04.986093 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:04 crc kubenswrapper[4704]: I1125 15:55:04.987042 4704 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 15:55:05 crc kubenswrapper[4704]: I1125 15:55:05.046379 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:22 crc kubenswrapper[4704]: I1125 15:55:22.880803 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-m7kk7"] Nov 25 15:55:22 crc kubenswrapper[4704]: I1125 15:55:22.891734 4704 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-m7kk7"] Nov 25 15:55:22 crc kubenswrapper[4704]: I1125 15:55:22.969960 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance50cb-account-delete-j24ks"] Nov 25 15:55:22 crc kubenswrapper[4704]: I1125 15:55:22.971310 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance50cb-account-delete-j24ks" Nov 25 15:55:22 crc kubenswrapper[4704]: I1125 15:55:22.981645 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 25 15:55:22 crc kubenswrapper[4704]: I1125 15:55:22.982312 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="92b220a0-eefb-435c-92e7-f078d294689f" containerName="glance-log" containerID="cri-o://f5be42917874dd904adde8e131e4aaec272ff241df8f5034c49efd4e4e0b1192" gracePeriod=30 Nov 25 15:55:22 crc kubenswrapper[4704]: I1125 15:55:22.982456 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="92b220a0-eefb-435c-92e7-f078d294689f" containerName="glance-httpd" containerID="cri-o://f8a3fa677ec9aedeb7ea9bb40f61d3dcca235bcc00252324cb128c1b2609ee71" gracePeriod=30 Nov 25 15:55:22 crc kubenswrapper[4704]: I1125 15:55:22.999515 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.000750 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="a574a830-e183-484d-a06c-660b14b93539" containerName="glance-log" containerID="cri-o://d83a2ad6a169d38a648b6e361411654693f6dfad1526111757c1a3e0537f540e" gracePeriod=30 Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.000818 4704 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="a574a830-e183-484d-a06c-660b14b93539" containerName="glance-httpd" containerID="cri-o://1d3eda75f1e6c9474bd9dcaddcb0a430dae06ef3d9e46fcc5d627b5ee06ff52e" gracePeriod=30 Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.012587 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance50cb-account-delete-j24ks"] Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.085978 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn7h8\" (UniqueName: \"kubernetes.io/projected/f7ea419f-3967-4d59-8dea-9fc4fa003b75-kube-api-access-wn7h8\") pod \"glance50cb-account-delete-j24ks\" (UID: \"f7ea419f-3967-4d59-8dea-9fc4fa003b75\") " pod="glance-kuttl-tests/glance50cb-account-delete-j24ks" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.086076 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7ea419f-3967-4d59-8dea-9fc4fa003b75-operator-scripts\") pod \"glance50cb-account-delete-j24ks\" (UID: \"f7ea419f-3967-4d59-8dea-9fc4fa003b75\") " pod="glance-kuttl-tests/glance50cb-account-delete-j24ks" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.150341 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.150647 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstackclient" podUID="5f5fd1d0-8bba-442f-a35c-15cfcf49a92f" containerName="openstackclient" containerID="cri-o://d19757ffcc5355812e84aeccf0d3a2bf4b6b587c36b7dd634eb55bdbf36104a3" gracePeriod=30 Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.187800 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f7ea419f-3967-4d59-8dea-9fc4fa003b75-operator-scripts\") pod \"glance50cb-account-delete-j24ks\" (UID: \"f7ea419f-3967-4d59-8dea-9fc4fa003b75\") " pod="glance-kuttl-tests/glance50cb-account-delete-j24ks" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.187944 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn7h8\" (UniqueName: \"kubernetes.io/projected/f7ea419f-3967-4d59-8dea-9fc4fa003b75-kube-api-access-wn7h8\") pod \"glance50cb-account-delete-j24ks\" (UID: \"f7ea419f-3967-4d59-8dea-9fc4fa003b75\") " pod="glance-kuttl-tests/glance50cb-account-delete-j24ks" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.189370 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7ea419f-3967-4d59-8dea-9fc4fa003b75-operator-scripts\") pod \"glance50cb-account-delete-j24ks\" (UID: \"f7ea419f-3967-4d59-8dea-9fc4fa003b75\") " pod="glance-kuttl-tests/glance50cb-account-delete-j24ks" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.214756 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn7h8\" (UniqueName: \"kubernetes.io/projected/f7ea419f-3967-4d59-8dea-9fc4fa003b75-kube-api-access-wn7h8\") pod \"glance50cb-account-delete-j24ks\" (UID: \"f7ea419f-3967-4d59-8dea-9fc4fa003b75\") " pod="glance-kuttl-tests/glance50cb-account-delete-j24ks" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.300958 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance50cb-account-delete-j24ks" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.562260 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.699146 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-config-secret\") pod \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\" (UID: \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\") " Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.699219 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdjlr\" (UniqueName: \"kubernetes.io/projected/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-kube-api-access-tdjlr\") pod \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\" (UID: \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\") " Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.699295 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-config\") pod \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\" (UID: \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\") " Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.699339 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-scripts\") pod \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\" (UID: \"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f\") " Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.701319 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-scripts" (OuterVolumeSpecName: "openstack-scripts") pod "5f5fd1d0-8bba-442f-a35c-15cfcf49a92f" (UID: "5f5fd1d0-8bba-442f-a35c-15cfcf49a92f"). InnerVolumeSpecName "openstack-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.704460 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-kube-api-access-tdjlr" (OuterVolumeSpecName: "kube-api-access-tdjlr") pod "5f5fd1d0-8bba-442f-a35c-15cfcf49a92f" (UID: "5f5fd1d0-8bba-442f-a35c-15cfcf49a92f"). InnerVolumeSpecName "kube-api-access-tdjlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.719679 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5f5fd1d0-8bba-442f-a35c-15cfcf49a92f" (UID: "5f5fd1d0-8bba-442f-a35c-15cfcf49a92f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.724226 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5f5fd1d0-8bba-442f-a35c-15cfcf49a92f" (UID: "5f5fd1d0-8bba-442f-a35c-15cfcf49a92f"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.746593 4704 generic.go:334] "Generic (PLEG): container finished" podID="92b220a0-eefb-435c-92e7-f078d294689f" containerID="f5be42917874dd904adde8e131e4aaec272ff241df8f5034c49efd4e4e0b1192" exitCode=143 Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.746647 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"92b220a0-eefb-435c-92e7-f078d294689f","Type":"ContainerDied","Data":"f5be42917874dd904adde8e131e4aaec272ff241df8f5034c49efd4e4e0b1192"} Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.748611 4704 generic.go:334] "Generic (PLEG): container finished" podID="5f5fd1d0-8bba-442f-a35c-15cfcf49a92f" containerID="d19757ffcc5355812e84aeccf0d3a2bf4b6b587c36b7dd634eb55bdbf36104a3" exitCode=143 Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.748680 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.748701 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f","Type":"ContainerDied","Data":"d19757ffcc5355812e84aeccf0d3a2bf4b6b587c36b7dd634eb55bdbf36104a3"} Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.748784 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"5f5fd1d0-8bba-442f-a35c-15cfcf49a92f","Type":"ContainerDied","Data":"b9961e751f697547cac820249b7bc822fc8f48f963710d94e1f0cf9f7b824f37"} Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.748830 4704 scope.go:117] "RemoveContainer" containerID="d19757ffcc5355812e84aeccf0d3a2bf4b6b587c36b7dd634eb55bdbf36104a3" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.754218 4704 generic.go:334] "Generic (PLEG): container finished" podID="a574a830-e183-484d-a06c-660b14b93539" containerID="d83a2ad6a169d38a648b6e361411654693f6dfad1526111757c1a3e0537f540e" exitCode=143 Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.754261 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a574a830-e183-484d-a06c-660b14b93539","Type":"ContainerDied","Data":"d83a2ad6a169d38a648b6e361411654693f6dfad1526111757c1a3e0537f540e"} Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.783593 4704 scope.go:117] "RemoveContainer" containerID="d19757ffcc5355812e84aeccf0d3a2bf4b6b587c36b7dd634eb55bdbf36104a3" Nov 25 15:55:23 crc kubenswrapper[4704]: E1125 15:55:23.784253 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d19757ffcc5355812e84aeccf0d3a2bf4b6b587c36b7dd634eb55bdbf36104a3\": container with ID starting with d19757ffcc5355812e84aeccf0d3a2bf4b6b587c36b7dd634eb55bdbf36104a3 not found: ID does not exist" 
containerID="d19757ffcc5355812e84aeccf0d3a2bf4b6b587c36b7dd634eb55bdbf36104a3" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.784342 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d19757ffcc5355812e84aeccf0d3a2bf4b6b587c36b7dd634eb55bdbf36104a3"} err="failed to get container status \"d19757ffcc5355812e84aeccf0d3a2bf4b6b587c36b7dd634eb55bdbf36104a3\": rpc error: code = NotFound desc = could not find container \"d19757ffcc5355812e84aeccf0d3a2bf4b6b587c36b7dd634eb55bdbf36104a3\": container with ID starting with d19757ffcc5355812e84aeccf0d3a2bf4b6b587c36b7dd634eb55bdbf36104a3 not found: ID does not exist" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.789133 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.800077 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.800585 4704 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.800613 4704 reconciler_common.go:293] "Volume detached for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.800623 4704 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.800632 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdjlr\" (UniqueName: 
\"kubernetes.io/projected/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f-kube-api-access-tdjlr\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:23 crc kubenswrapper[4704]: I1125 15:55:23.809268 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance50cb-account-delete-j24ks"] Nov 25 15:55:24 crc kubenswrapper[4704]: I1125 15:55:24.425913 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f5fd1d0-8bba-442f-a35c-15cfcf49a92f" path="/var/lib/kubelet/pods/5f5fd1d0-8bba-442f-a35c-15cfcf49a92f/volumes" Nov 25 15:55:24 crc kubenswrapper[4704]: I1125 15:55:24.426841 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0d727c8-1b1e-4a10-ad2b-7500a2d78d44" path="/var/lib/kubelet/pods/b0d727c8-1b1e-4a10-ad2b-7500a2d78d44/volumes" Nov 25 15:55:24 crc kubenswrapper[4704]: I1125 15:55:24.763577 4704 generic.go:334] "Generic (PLEG): container finished" podID="f7ea419f-3967-4d59-8dea-9fc4fa003b75" containerID="67f51b18809bc1bca8a1c59046482cad9b26e385c77db7991f668f447d5972dc" exitCode=0 Nov 25 15:55:24 crc kubenswrapper[4704]: I1125 15:55:24.763641 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance50cb-account-delete-j24ks" event={"ID":"f7ea419f-3967-4d59-8dea-9fc4fa003b75","Type":"ContainerDied","Data":"67f51b18809bc1bca8a1c59046482cad9b26e385c77db7991f668f447d5972dc"} Nov 25 15:55:24 crc kubenswrapper[4704]: I1125 15:55:24.763702 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance50cb-account-delete-j24ks" event={"ID":"f7ea419f-3967-4d59-8dea-9fc4fa003b75","Type":"ContainerStarted","Data":"eb26caad517002fdafd55bdaeda913fd06ba97a8de6eeb6bcd78b9aa73c70122"} Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.042104 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance50cb-account-delete-j24ks" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.235451 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7ea419f-3967-4d59-8dea-9fc4fa003b75-operator-scripts\") pod \"f7ea419f-3967-4d59-8dea-9fc4fa003b75\" (UID: \"f7ea419f-3967-4d59-8dea-9fc4fa003b75\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.235495 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn7h8\" (UniqueName: \"kubernetes.io/projected/f7ea419f-3967-4d59-8dea-9fc4fa003b75-kube-api-access-wn7h8\") pod \"f7ea419f-3967-4d59-8dea-9fc4fa003b75\" (UID: \"f7ea419f-3967-4d59-8dea-9fc4fa003b75\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.236519 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7ea419f-3967-4d59-8dea-9fc4fa003b75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7ea419f-3967-4d59-8dea-9fc4fa003b75" (UID: "f7ea419f-3967-4d59-8dea-9fc4fa003b75"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.243685 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ea419f-3967-4d59-8dea-9fc4fa003b75-kube-api-access-wn7h8" (OuterVolumeSpecName: "kube-api-access-wn7h8") pod "f7ea419f-3967-4d59-8dea-9fc4fa003b75" (UID: "f7ea419f-3967-4d59-8dea-9fc4fa003b75"). InnerVolumeSpecName "kube-api-access-wn7h8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.338526 4704 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7ea419f-3967-4d59-8dea-9fc4fa003b75-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.338564 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn7h8\" (UniqueName: \"kubernetes.io/projected/f7ea419f-3967-4d59-8dea-9fc4fa003b75-kube-api-access-wn7h8\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.506165 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.617469 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.643954 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2rkd\" (UniqueName: \"kubernetes.io/projected/92b220a0-eefb-435c-92e7-f078d294689f-kube-api-access-f2rkd\") pod \"92b220a0-eefb-435c-92e7-f078d294689f\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.643995 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-lib-modules\") pod \"92b220a0-eefb-435c-92e7-f078d294689f\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644037 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/92b220a0-eefb-435c-92e7-f078d294689f-httpd-run\") pod 
\"92b220a0-eefb-435c-92e7-f078d294689f\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644195 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-etc-iscsi\") pod \"92b220a0-eefb-435c-92e7-f078d294689f\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644224 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-run\") pod \"a574a830-e183-484d-a06c-660b14b93539\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644250 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b220a0-eefb-435c-92e7-f078d294689f-config-data\") pod \"92b220a0-eefb-435c-92e7-f078d294689f\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644276 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a574a830-e183-484d-a06c-660b14b93539-httpd-run\") pod \"a574a830-e183-484d-a06c-660b14b93539\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644316 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"92b220a0-eefb-435c-92e7-f078d294689f\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644337 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-sys\") pod \"92b220a0-eefb-435c-92e7-f078d294689f\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644369 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-run\") pod \"92b220a0-eefb-435c-92e7-f078d294689f\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644394 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92b220a0-eefb-435c-92e7-f078d294689f-scripts\") pod \"92b220a0-eefb-435c-92e7-f078d294689f\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644413 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-lib-modules\") pod \"a574a830-e183-484d-a06c-660b14b93539\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644434 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92b220a0-eefb-435c-92e7-f078d294689f-logs\") pod \"92b220a0-eefb-435c-92e7-f078d294689f\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644450 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-sys\") pod \"a574a830-e183-484d-a06c-660b14b93539\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644463 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/92b220a0-eefb-435c-92e7-f078d294689f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "92b220a0-eefb-435c-92e7-f078d294689f" (UID: "92b220a0-eefb-435c-92e7-f078d294689f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644471 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-var-locks-brick\") pod \"92b220a0-eefb-435c-92e7-f078d294689f\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644498 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "92b220a0-eefb-435c-92e7-f078d294689f" (UID: "92b220a0-eefb-435c-92e7-f078d294689f"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644506 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-dev\") pod \"92b220a0-eefb-435c-92e7-f078d294689f\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644525 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"92b220a0-eefb-435c-92e7-f078d294689f\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644542 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a574a830-e183-484d-a06c-660b14b93539-logs\") pod \"a574a830-e183-484d-a06c-660b14b93539\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644559 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-var-locks-brick\") pod \"a574a830-e183-484d-a06c-660b14b93539\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644575 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"a574a830-e183-484d-a06c-660b14b93539\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644599 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6562k\" (UniqueName: \"kubernetes.io/projected/a574a830-e183-484d-a06c-660b14b93539-kube-api-access-6562k\") pod 
\"a574a830-e183-484d-a06c-660b14b93539\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644613 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-etc-iscsi\") pod \"a574a830-e183-484d-a06c-660b14b93539\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644630 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a574a830-e183-484d-a06c-660b14b93539-scripts\") pod \"a574a830-e183-484d-a06c-660b14b93539\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644645 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-dev\") pod \"a574a830-e183-484d-a06c-660b14b93539\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644658 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-etc-nvme\") pod \"a574a830-e183-484d-a06c-660b14b93539\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644676 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-etc-nvme\") pod \"92b220a0-eefb-435c-92e7-f078d294689f\" (UID: \"92b220a0-eefb-435c-92e7-f078d294689f\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644696 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"a574a830-e183-484d-a06c-660b14b93539\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644884 4704 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644897 4704 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/92b220a0-eefb-435c-92e7-f078d294689f-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644524 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "92b220a0-eefb-435c-92e7-f078d294689f" (UID: "92b220a0-eefb-435c-92e7-f078d294689f"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644543 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-run" (OuterVolumeSpecName: "run") pod "a574a830-e183-484d-a06c-660b14b93539" (UID: "a574a830-e183-484d-a06c-660b14b93539"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.644885 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-run" (OuterVolumeSpecName: "run") pod "92b220a0-eefb-435c-92e7-f078d294689f" (UID: "92b220a0-eefb-435c-92e7-f078d294689f"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.645347 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "a574a830-e183-484d-a06c-660b14b93539" (UID: "a574a830-e183-484d-a06c-660b14b93539"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.646152 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a574a830-e183-484d-a06c-660b14b93539-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a574a830-e183-484d-a06c-660b14b93539" (UID: "a574a830-e183-484d-a06c-660b14b93539"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.646988 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-sys" (OuterVolumeSpecName: "sys") pod "a574a830-e183-484d-a06c-660b14b93539" (UID: "a574a830-e183-484d-a06c-660b14b93539"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.646995 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-sys" (OuterVolumeSpecName: "sys") pod "92b220a0-eefb-435c-92e7-f078d294689f" (UID: "92b220a0-eefb-435c-92e7-f078d294689f"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.647632 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-dev" (OuterVolumeSpecName: "dev") pod "a574a830-e183-484d-a06c-660b14b93539" (UID: "a574a830-e183-484d-a06c-660b14b93539"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.647691 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "92b220a0-eefb-435c-92e7-f078d294689f" (UID: "92b220a0-eefb-435c-92e7-f078d294689f"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.647728 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "92b220a0-eefb-435c-92e7-f078d294689f" (UID: "92b220a0-eefb-435c-92e7-f078d294689f"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.647711 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "a574a830-e183-484d-a06c-660b14b93539" (UID: "a574a830-e183-484d-a06c-660b14b93539"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.647753 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-dev" (OuterVolumeSpecName: "dev") pod "92b220a0-eefb-435c-92e7-f078d294689f" (UID: "92b220a0-eefb-435c-92e7-f078d294689f"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.647757 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "a574a830-e183-484d-a06c-660b14b93539" (UID: "a574a830-e183-484d-a06c-660b14b93539"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.648395 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "a574a830-e183-484d-a06c-660b14b93539" (UID: "a574a830-e183-484d-a06c-660b14b93539"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.648415 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92b220a0-eefb-435c-92e7-f078d294689f-logs" (OuterVolumeSpecName: "logs") pod "92b220a0-eefb-435c-92e7-f078d294689f" (UID: "92b220a0-eefb-435c-92e7-f078d294689f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.648619 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a574a830-e183-484d-a06c-660b14b93539-logs" (OuterVolumeSpecName: "logs") pod "a574a830-e183-484d-a06c-660b14b93539" (UID: "a574a830-e183-484d-a06c-660b14b93539"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.651674 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a574a830-e183-484d-a06c-660b14b93539-scripts" (OuterVolumeSpecName: "scripts") pod "a574a830-e183-484d-a06c-660b14b93539" (UID: "a574a830-e183-484d-a06c-660b14b93539"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.654790 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "a574a830-e183-484d-a06c-660b14b93539" (UID: "a574a830-e183-484d-a06c-660b14b93539"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.654778 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92b220a0-eefb-435c-92e7-f078d294689f-kube-api-access-f2rkd" (OuterVolumeSpecName: "kube-api-access-f2rkd") pod "92b220a0-eefb-435c-92e7-f078d294689f" (UID: "92b220a0-eefb-435c-92e7-f078d294689f"). InnerVolumeSpecName "kube-api-access-f2rkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.654790 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92b220a0-eefb-435c-92e7-f078d294689f-scripts" (OuterVolumeSpecName: "scripts") pod "92b220a0-eefb-435c-92e7-f078d294689f" (UID: "92b220a0-eefb-435c-92e7-f078d294689f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.657051 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a574a830-e183-484d-a06c-660b14b93539-kube-api-access-6562k" (OuterVolumeSpecName: "kube-api-access-6562k") pod "a574a830-e183-484d-a06c-660b14b93539" (UID: "a574a830-e183-484d-a06c-660b14b93539"). InnerVolumeSpecName "kube-api-access-6562k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.660107 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "92b220a0-eefb-435c-92e7-f078d294689f" (UID: "92b220a0-eefb-435c-92e7-f078d294689f"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.661188 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "a574a830-e183-484d-a06c-660b14b93539" (UID: "a574a830-e183-484d-a06c-660b14b93539"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.661276 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance-cache") pod "92b220a0-eefb-435c-92e7-f078d294689f" (UID: "92b220a0-eefb-435c-92e7-f078d294689f"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.683116 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92b220a0-eefb-435c-92e7-f078d294689f-config-data" (OuterVolumeSpecName: "config-data") pod "92b220a0-eefb-435c-92e7-f078d294689f" (UID: "92b220a0-eefb-435c-92e7-f078d294689f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.745719 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a574a830-e183-484d-a06c-660b14b93539-config-data\") pod \"a574a830-e183-484d-a06c-660b14b93539\" (UID: \"a574a830-e183-484d-a06c-660b14b93539\") " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746034 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92b220a0-eefb-435c-92e7-f078d294689f-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746046 4704 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746057 4704 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-run\") on node \"crc\" 
DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746066 4704 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a574a830-e183-484d-a06c-660b14b93539-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746093 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746104 4704 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-sys\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746112 4704 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-run\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746120 4704 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92b220a0-eefb-435c-92e7-f078d294689f-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746128 4704 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746138 4704 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92b220a0-eefb-435c-92e7-f078d294689f-logs\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746146 4704 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-sys\") on 
node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746154 4704 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-dev\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746174 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746190 4704 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a574a830-e183-484d-a06c-660b14b93539-logs\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746201 4704 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746221 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746235 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6562k\" (UniqueName: \"kubernetes.io/projected/a574a830-e183-484d-a06c-660b14b93539-kube-api-access-6562k\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746246 4704 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746255 4704 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a574a830-e183-484d-a06c-660b14b93539-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746263 4704 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-dev\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746271 4704 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a574a830-e183-484d-a06c-660b14b93539-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746279 4704 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746293 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746304 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2rkd\" (UniqueName: \"kubernetes.io/projected/92b220a0-eefb-435c-92e7-f078d294689f-kube-api-access-f2rkd\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.746312 4704 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/92b220a0-eefb-435c-92e7-f078d294689f-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.760020 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.760241 4704 operation_generator.go:917] UnmountDevice 
succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.760988 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.764356 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.785153 4704 generic.go:334] "Generic (PLEG): container finished" podID="92b220a0-eefb-435c-92e7-f078d294689f" containerID="f8a3fa677ec9aedeb7ea9bb40f61d3dcca235bcc00252324cb128c1b2609ee71" exitCode=0 Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.785241 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"92b220a0-eefb-435c-92e7-f078d294689f","Type":"ContainerDied","Data":"f8a3fa677ec9aedeb7ea9bb40f61d3dcca235bcc00252324cb128c1b2609ee71"} Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.785254 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.785298 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"92b220a0-eefb-435c-92e7-f078d294689f","Type":"ContainerDied","Data":"b272ab752c32222bda2309162307ac74f1d05e7cdd54b872c972546359cf5b09"} Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.785346 4704 scope.go:117] "RemoveContainer" containerID="f8a3fa677ec9aedeb7ea9bb40f61d3dcca235bcc00252324cb128c1b2609ee71" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.787186 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a574a830-e183-484d-a06c-660b14b93539-config-data" (OuterVolumeSpecName: "config-data") pod "a574a830-e183-484d-a06c-660b14b93539" (UID: "a574a830-e183-484d-a06c-660b14b93539"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.789104 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance50cb-account-delete-j24ks" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.789105 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance50cb-account-delete-j24ks" event={"ID":"f7ea419f-3967-4d59-8dea-9fc4fa003b75","Type":"ContainerDied","Data":"eb26caad517002fdafd55bdaeda913fd06ba97a8de6eeb6bcd78b9aa73c70122"} Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.789144 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb26caad517002fdafd55bdaeda913fd06ba97a8de6eeb6bcd78b9aa73c70122" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.792030 4704 generic.go:334] "Generic (PLEG): container finished" podID="a574a830-e183-484d-a06c-660b14b93539" containerID="1d3eda75f1e6c9474bd9dcaddcb0a430dae06ef3d9e46fcc5d627b5ee06ff52e" exitCode=0 Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.792062 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a574a830-e183-484d-a06c-660b14b93539","Type":"ContainerDied","Data":"1d3eda75f1e6c9474bd9dcaddcb0a430dae06ef3d9e46fcc5d627b5ee06ff52e"} Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.792082 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a574a830-e183-484d-a06c-660b14b93539","Type":"ContainerDied","Data":"d05285c634a9735020cb61e954d886e67d7af77fa49d57088fd81ce5476fbed7"} Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.792117 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.820588 4704 scope.go:117] "RemoveContainer" containerID="f5be42917874dd904adde8e131e4aaec272ff241df8f5034c49efd4e4e0b1192" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.830411 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.844383 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.846932 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.846982 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.846991 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.847000 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.847011 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a574a830-e183-484d-a06c-660b14b93539-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.850044 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 
15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.850772 4704 scope.go:117] "RemoveContainer" containerID="f8a3fa677ec9aedeb7ea9bb40f61d3dcca235bcc00252324cb128c1b2609ee71" Nov 25 15:55:26 crc kubenswrapper[4704]: E1125 15:55:26.851285 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8a3fa677ec9aedeb7ea9bb40f61d3dcca235bcc00252324cb128c1b2609ee71\": container with ID starting with f8a3fa677ec9aedeb7ea9bb40f61d3dcca235bcc00252324cb128c1b2609ee71 not found: ID does not exist" containerID="f8a3fa677ec9aedeb7ea9bb40f61d3dcca235bcc00252324cb128c1b2609ee71" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.851357 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a3fa677ec9aedeb7ea9bb40f61d3dcca235bcc00252324cb128c1b2609ee71"} err="failed to get container status \"f8a3fa677ec9aedeb7ea9bb40f61d3dcca235bcc00252324cb128c1b2609ee71\": rpc error: code = NotFound desc = could not find container \"f8a3fa677ec9aedeb7ea9bb40f61d3dcca235bcc00252324cb128c1b2609ee71\": container with ID starting with f8a3fa677ec9aedeb7ea9bb40f61d3dcca235bcc00252324cb128c1b2609ee71 not found: ID does not exist" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.851385 4704 scope.go:117] "RemoveContainer" containerID="f5be42917874dd904adde8e131e4aaec272ff241df8f5034c49efd4e4e0b1192" Nov 25 15:55:26 crc kubenswrapper[4704]: E1125 15:55:26.852122 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5be42917874dd904adde8e131e4aaec272ff241df8f5034c49efd4e4e0b1192\": container with ID starting with f5be42917874dd904adde8e131e4aaec272ff241df8f5034c49efd4e4e0b1192 not found: ID does not exist" containerID="f5be42917874dd904adde8e131e4aaec272ff241df8f5034c49efd4e4e0b1192" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.852159 4704 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"f5be42917874dd904adde8e131e4aaec272ff241df8f5034c49efd4e4e0b1192"} err="failed to get container status \"f5be42917874dd904adde8e131e4aaec272ff241df8f5034c49efd4e4e0b1192\": rpc error: code = NotFound desc = could not find container \"f5be42917874dd904adde8e131e4aaec272ff241df8f5034c49efd4e4e0b1192\": container with ID starting with f5be42917874dd904adde8e131e4aaec272ff241df8f5034c49efd4e4e0b1192 not found: ID does not exist" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.852187 4704 scope.go:117] "RemoveContainer" containerID="1d3eda75f1e6c9474bd9dcaddcb0a430dae06ef3d9e46fcc5d627b5ee06ff52e" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.858137 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.877647 4704 scope.go:117] "RemoveContainer" containerID="d83a2ad6a169d38a648b6e361411654693f6dfad1526111757c1a3e0537f540e" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.894493 4704 scope.go:117] "RemoveContainer" containerID="1d3eda75f1e6c9474bd9dcaddcb0a430dae06ef3d9e46fcc5d627b5ee06ff52e" Nov 25 15:55:26 crc kubenswrapper[4704]: E1125 15:55:26.895001 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d3eda75f1e6c9474bd9dcaddcb0a430dae06ef3d9e46fcc5d627b5ee06ff52e\": container with ID starting with 1d3eda75f1e6c9474bd9dcaddcb0a430dae06ef3d9e46fcc5d627b5ee06ff52e not found: ID does not exist" containerID="1d3eda75f1e6c9474bd9dcaddcb0a430dae06ef3d9e46fcc5d627b5ee06ff52e" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.895114 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3eda75f1e6c9474bd9dcaddcb0a430dae06ef3d9e46fcc5d627b5ee06ff52e"} err="failed to get container status \"1d3eda75f1e6c9474bd9dcaddcb0a430dae06ef3d9e46fcc5d627b5ee06ff52e\": rpc error: code = 
NotFound desc = could not find container \"1d3eda75f1e6c9474bd9dcaddcb0a430dae06ef3d9e46fcc5d627b5ee06ff52e\": container with ID starting with 1d3eda75f1e6c9474bd9dcaddcb0a430dae06ef3d9e46fcc5d627b5ee06ff52e not found: ID does not exist" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.895200 4704 scope.go:117] "RemoveContainer" containerID="d83a2ad6a169d38a648b6e361411654693f6dfad1526111757c1a3e0537f540e" Nov 25 15:55:26 crc kubenswrapper[4704]: E1125 15:55:26.895555 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d83a2ad6a169d38a648b6e361411654693f6dfad1526111757c1a3e0537f540e\": container with ID starting with d83a2ad6a169d38a648b6e361411654693f6dfad1526111757c1a3e0537f540e not found: ID does not exist" containerID="d83a2ad6a169d38a648b6e361411654693f6dfad1526111757c1a3e0537f540e" Nov 25 15:55:26 crc kubenswrapper[4704]: I1125 15:55:26.895602 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d83a2ad6a169d38a648b6e361411654693f6dfad1526111757c1a3e0537f540e"} err="failed to get container status \"d83a2ad6a169d38a648b6e361411654693f6dfad1526111757c1a3e0537f540e\": rpc error: code = NotFound desc = could not find container \"d83a2ad6a169d38a648b6e361411654693f6dfad1526111757c1a3e0537f540e\": container with ID starting with d83a2ad6a169d38a648b6e361411654693f6dfad1526111757c1a3e0537f540e not found: ID does not exist" Nov 25 15:55:28 crc kubenswrapper[4704]: I1125 15:55:28.000387 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-6nxcq"] Nov 25 15:55:28 crc kubenswrapper[4704]: I1125 15:55:28.005642 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-6nxcq"] Nov 25 15:55:28 crc kubenswrapper[4704]: I1125 15:55:28.040311 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-50cb-account-create-update-t2pzh"] Nov 25 
15:55:28 crc kubenswrapper[4704]: I1125 15:55:28.046952 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance50cb-account-delete-j24ks"] Nov 25 15:55:28 crc kubenswrapper[4704]: I1125 15:55:28.051934 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-50cb-account-create-update-t2pzh"] Nov 25 15:55:28 crc kubenswrapper[4704]: I1125 15:55:28.057466 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance50cb-account-delete-j24ks"] Nov 25 15:55:28 crc kubenswrapper[4704]: I1125 15:55:28.423677 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7391f8a5-e178-4c42-9845-015504a91779" path="/var/lib/kubelet/pods/7391f8a5-e178-4c42-9845-015504a91779/volumes" Nov 25 15:55:28 crc kubenswrapper[4704]: I1125 15:55:28.424555 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92b220a0-eefb-435c-92e7-f078d294689f" path="/var/lib/kubelet/pods/92b220a0-eefb-435c-92e7-f078d294689f/volumes" Nov 25 15:55:28 crc kubenswrapper[4704]: I1125 15:55:28.425151 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a574a830-e183-484d-a06c-660b14b93539" path="/var/lib/kubelet/pods/a574a830-e183-484d-a06c-660b14b93539/volumes" Nov 25 15:55:28 crc kubenswrapper[4704]: I1125 15:55:28.426183 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daf1e165-d977-4aa8-a214-d9fa6552ec18" path="/var/lib/kubelet/pods/daf1e165-d977-4aa8-a214-d9fa6552ec18/volumes" Nov 25 15:55:28 crc kubenswrapper[4704]: I1125 15:55:28.426662 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7ea419f-3967-4d59-8dea-9fc4fa003b75" path="/var/lib/kubelet/pods/f7ea419f-3967-4d59-8dea-9fc4fa003b75/volumes" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.301243 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-n9llf"] Nov 25 15:55:29 crc kubenswrapper[4704]: E1125 
15:55:29.301928 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92b220a0-eefb-435c-92e7-f078d294689f" containerName="glance-httpd" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.301943 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b220a0-eefb-435c-92e7-f078d294689f" containerName="glance-httpd" Nov 25 15:55:29 crc kubenswrapper[4704]: E1125 15:55:29.301955 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f5fd1d0-8bba-442f-a35c-15cfcf49a92f" containerName="openstackclient" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.301964 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f5fd1d0-8bba-442f-a35c-15cfcf49a92f" containerName="openstackclient" Nov 25 15:55:29 crc kubenswrapper[4704]: E1125 15:55:29.301973 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a574a830-e183-484d-a06c-660b14b93539" containerName="glance-httpd" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.301980 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="a574a830-e183-484d-a06c-660b14b93539" containerName="glance-httpd" Nov 25 15:55:29 crc kubenswrapper[4704]: E1125 15:55:29.301995 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ea419f-3967-4d59-8dea-9fc4fa003b75" containerName="mariadb-account-delete" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.302001 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ea419f-3967-4d59-8dea-9fc4fa003b75" containerName="mariadb-account-delete" Nov 25 15:55:29 crc kubenswrapper[4704]: E1125 15:55:29.302023 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92b220a0-eefb-435c-92e7-f078d294689f" containerName="glance-log" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.302029 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b220a0-eefb-435c-92e7-f078d294689f" containerName="glance-log" Nov 25 15:55:29 crc kubenswrapper[4704]: E1125 15:55:29.302045 4704 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a574a830-e183-484d-a06c-660b14b93539" containerName="glance-log" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.302050 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="a574a830-e183-484d-a06c-660b14b93539" containerName="glance-log" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.302191 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="a574a830-e183-484d-a06c-660b14b93539" containerName="glance-log" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.302204 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="a574a830-e183-484d-a06c-660b14b93539" containerName="glance-httpd" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.302216 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f5fd1d0-8bba-442f-a35c-15cfcf49a92f" containerName="openstackclient" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.302224 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="92b220a0-eefb-435c-92e7-f078d294689f" containerName="glance-httpd" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.302231 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ea419f-3967-4d59-8dea-9fc4fa003b75" containerName="mariadb-account-delete" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.302240 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="92b220a0-eefb-435c-92e7-f078d294689f" containerName="glance-log" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.302852 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-n9llf" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.325738 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-n9llf"] Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.382138 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4h4r\" (UniqueName: \"kubernetes.io/projected/bb80a0aa-905e-4823-bf03-0880fce6d7f0-kube-api-access-w4h4r\") pod \"glance-db-create-n9llf\" (UID: \"bb80a0aa-905e-4823-bf03-0880fce6d7f0\") " pod="glance-kuttl-tests/glance-db-create-n9llf" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.382225 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb80a0aa-905e-4823-bf03-0880fce6d7f0-operator-scripts\") pod \"glance-db-create-n9llf\" (UID: \"bb80a0aa-905e-4823-bf03-0880fce6d7f0\") " pod="glance-kuttl-tests/glance-db-create-n9llf" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.484120 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4h4r\" (UniqueName: \"kubernetes.io/projected/bb80a0aa-905e-4823-bf03-0880fce6d7f0-kube-api-access-w4h4r\") pod \"glance-db-create-n9llf\" (UID: \"bb80a0aa-905e-4823-bf03-0880fce6d7f0\") " pod="glance-kuttl-tests/glance-db-create-n9llf" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.484228 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb80a0aa-905e-4823-bf03-0880fce6d7f0-operator-scripts\") pod \"glance-db-create-n9llf\" (UID: \"bb80a0aa-905e-4823-bf03-0880fce6d7f0\") " pod="glance-kuttl-tests/glance-db-create-n9llf" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.485553 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb80a0aa-905e-4823-bf03-0880fce6d7f0-operator-scripts\") pod \"glance-db-create-n9llf\" (UID: \"bb80a0aa-905e-4823-bf03-0880fce6d7f0\") " pod="glance-kuttl-tests/glance-db-create-n9llf" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.501100 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-3787-account-create-update-w6j7v"] Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.502125 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-3787-account-create-update-w6j7v" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.504663 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.510880 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4h4r\" (UniqueName: \"kubernetes.io/projected/bb80a0aa-905e-4823-bf03-0880fce6d7f0-kube-api-access-w4h4r\") pod \"glance-db-create-n9llf\" (UID: \"bb80a0aa-905e-4823-bf03-0880fce6d7f0\") " pod="glance-kuttl-tests/glance-db-create-n9llf" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.514363 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-3787-account-create-update-w6j7v"] Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.626659 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-n9llf" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.692605 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81ed81c2-d941-47cf-822b-f3e518a9cd4c-operator-scripts\") pod \"glance-3787-account-create-update-w6j7v\" (UID: \"81ed81c2-d941-47cf-822b-f3e518a9cd4c\") " pod="glance-kuttl-tests/glance-3787-account-create-update-w6j7v" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.692831 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krftc\" (UniqueName: \"kubernetes.io/projected/81ed81c2-d941-47cf-822b-f3e518a9cd4c-kube-api-access-krftc\") pod \"glance-3787-account-create-update-w6j7v\" (UID: \"81ed81c2-d941-47cf-822b-f3e518a9cd4c\") " pod="glance-kuttl-tests/glance-3787-account-create-update-w6j7v" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.794714 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krftc\" (UniqueName: \"kubernetes.io/projected/81ed81c2-d941-47cf-822b-f3e518a9cd4c-kube-api-access-krftc\") pod \"glance-3787-account-create-update-w6j7v\" (UID: \"81ed81c2-d941-47cf-822b-f3e518a9cd4c\") " pod="glance-kuttl-tests/glance-3787-account-create-update-w6j7v" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.794845 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81ed81c2-d941-47cf-822b-f3e518a9cd4c-operator-scripts\") pod \"glance-3787-account-create-update-w6j7v\" (UID: \"81ed81c2-d941-47cf-822b-f3e518a9cd4c\") " pod="glance-kuttl-tests/glance-3787-account-create-update-w6j7v" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.795973 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/81ed81c2-d941-47cf-822b-f3e518a9cd4c-operator-scripts\") pod \"glance-3787-account-create-update-w6j7v\" (UID: \"81ed81c2-d941-47cf-822b-f3e518a9cd4c\") " pod="glance-kuttl-tests/glance-3787-account-create-update-w6j7v" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.817989 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krftc\" (UniqueName: \"kubernetes.io/projected/81ed81c2-d941-47cf-822b-f3e518a9cd4c-kube-api-access-krftc\") pod \"glance-3787-account-create-update-w6j7v\" (UID: \"81ed81c2-d941-47cf-822b-f3e518a9cd4c\") " pod="glance-kuttl-tests/glance-3787-account-create-update-w6j7v" Nov 25 15:55:29 crc kubenswrapper[4704]: I1125 15:55:29.848283 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-3787-account-create-update-w6j7v" Nov 25 15:55:30 crc kubenswrapper[4704]: I1125 15:55:30.080924 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-n9llf"] Nov 25 15:55:30 crc kubenswrapper[4704]: I1125 15:55:30.087193 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-3787-account-create-update-w6j7v"] Nov 25 15:55:30 crc kubenswrapper[4704]: E1125 15:55:30.684925 4704 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81ed81c2_d941_47cf_822b_f3e518a9cd4c.slice/crio-76cbce8b6db02f840316a957d781a02904a4ed2d9f43981b170ec21c24f5569c.scope\": RecentStats: unable to find data in memory cache]" Nov 25 15:55:30 crc kubenswrapper[4704]: I1125 15:55:30.829339 4704 generic.go:334] "Generic (PLEG): container finished" podID="bb80a0aa-905e-4823-bf03-0880fce6d7f0" containerID="1fac7abea0b1b3522f33fe7c95ad238db03e6bfb8543443aa0a11d2474e6fe7d" exitCode=0 Nov 25 15:55:30 crc kubenswrapper[4704]: I1125 15:55:30.829711 4704 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-n9llf" event={"ID":"bb80a0aa-905e-4823-bf03-0880fce6d7f0","Type":"ContainerDied","Data":"1fac7abea0b1b3522f33fe7c95ad238db03e6bfb8543443aa0a11d2474e6fe7d"} Nov 25 15:55:30 crc kubenswrapper[4704]: I1125 15:55:30.829961 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-n9llf" event={"ID":"bb80a0aa-905e-4823-bf03-0880fce6d7f0","Type":"ContainerStarted","Data":"3cb8868e6b714a13faf7133408d9781801e2d705477d60f47afa6994e2142021"} Nov 25 15:55:30 crc kubenswrapper[4704]: I1125 15:55:30.831620 4704 generic.go:334] "Generic (PLEG): container finished" podID="81ed81c2-d941-47cf-822b-f3e518a9cd4c" containerID="76cbce8b6db02f840316a957d781a02904a4ed2d9f43981b170ec21c24f5569c" exitCode=0 Nov 25 15:55:30 crc kubenswrapper[4704]: I1125 15:55:30.831827 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3787-account-create-update-w6j7v" event={"ID":"81ed81c2-d941-47cf-822b-f3e518a9cd4c","Type":"ContainerDied","Data":"76cbce8b6db02f840316a957d781a02904a4ed2d9f43981b170ec21c24f5569c"} Nov 25 15:55:30 crc kubenswrapper[4704]: I1125 15:55:30.832650 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3787-account-create-update-w6j7v" event={"ID":"81ed81c2-d941-47cf-822b-f3e518a9cd4c","Type":"ContainerStarted","Data":"c54d572a9934d1c6985adb8fb482a1b38333c3a00524d3f2edd5abeedc0b442d"} Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.266755 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-n9llf" Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.275630 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-3787-account-create-update-w6j7v" Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.446955 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krftc\" (UniqueName: \"kubernetes.io/projected/81ed81c2-d941-47cf-822b-f3e518a9cd4c-kube-api-access-krftc\") pod \"81ed81c2-d941-47cf-822b-f3e518a9cd4c\" (UID: \"81ed81c2-d941-47cf-822b-f3e518a9cd4c\") " Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.447039 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81ed81c2-d941-47cf-822b-f3e518a9cd4c-operator-scripts\") pod \"81ed81c2-d941-47cf-822b-f3e518a9cd4c\" (UID: \"81ed81c2-d941-47cf-822b-f3e518a9cd4c\") " Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.447072 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4h4r\" (UniqueName: \"kubernetes.io/projected/bb80a0aa-905e-4823-bf03-0880fce6d7f0-kube-api-access-w4h4r\") pod \"bb80a0aa-905e-4823-bf03-0880fce6d7f0\" (UID: \"bb80a0aa-905e-4823-bf03-0880fce6d7f0\") " Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.447104 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb80a0aa-905e-4823-bf03-0880fce6d7f0-operator-scripts\") pod \"bb80a0aa-905e-4823-bf03-0880fce6d7f0\" (UID: \"bb80a0aa-905e-4823-bf03-0880fce6d7f0\") " Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.448096 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81ed81c2-d941-47cf-822b-f3e518a9cd4c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81ed81c2-d941-47cf-822b-f3e518a9cd4c" (UID: "81ed81c2-d941-47cf-822b-f3e518a9cd4c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.448231 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb80a0aa-905e-4823-bf03-0880fce6d7f0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb80a0aa-905e-4823-bf03-0880fce6d7f0" (UID: "bb80a0aa-905e-4823-bf03-0880fce6d7f0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.453517 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb80a0aa-905e-4823-bf03-0880fce6d7f0-kube-api-access-w4h4r" (OuterVolumeSpecName: "kube-api-access-w4h4r") pod "bb80a0aa-905e-4823-bf03-0880fce6d7f0" (UID: "bb80a0aa-905e-4823-bf03-0880fce6d7f0"). InnerVolumeSpecName "kube-api-access-w4h4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.454300 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ed81c2-d941-47cf-822b-f3e518a9cd4c-kube-api-access-krftc" (OuterVolumeSpecName: "kube-api-access-krftc") pod "81ed81c2-d941-47cf-822b-f3e518a9cd4c" (UID: "81ed81c2-d941-47cf-822b-f3e518a9cd4c"). InnerVolumeSpecName "kube-api-access-krftc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.551524 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4h4r\" (UniqueName: \"kubernetes.io/projected/bb80a0aa-905e-4823-bf03-0880fce6d7f0-kube-api-access-w4h4r\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.551563 4704 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb80a0aa-905e-4823-bf03-0880fce6d7f0-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.551581 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krftc\" (UniqueName: \"kubernetes.io/projected/81ed81c2-d941-47cf-822b-f3e518a9cd4c-kube-api-access-krftc\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.551594 4704 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81ed81c2-d941-47cf-822b-f3e518a9cd4c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.850138 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-n9llf" event={"ID":"bb80a0aa-905e-4823-bf03-0880fce6d7f0","Type":"ContainerDied","Data":"3cb8868e6b714a13faf7133408d9781801e2d705477d60f47afa6994e2142021"} Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.850171 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-n9llf" Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.850209 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cb8868e6b714a13faf7133408d9781801e2d705477d60f47afa6994e2142021" Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.852024 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3787-account-create-update-w6j7v" event={"ID":"81ed81c2-d941-47cf-822b-f3e518a9cd4c","Type":"ContainerDied","Data":"c54d572a9934d1c6985adb8fb482a1b38333c3a00524d3f2edd5abeedc0b442d"} Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.852052 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c54d572a9934d1c6985adb8fb482a1b38333c3a00524d3f2edd5abeedc0b442d" Nov 25 15:55:32 crc kubenswrapper[4704]: I1125 15:55:32.852097 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-3787-account-create-update-w6j7v" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.608355 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-8dmzk"] Nov 25 15:55:34 crc kubenswrapper[4704]: E1125 15:55:34.609115 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb80a0aa-905e-4823-bf03-0880fce6d7f0" containerName="mariadb-database-create" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.609129 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb80a0aa-905e-4823-bf03-0880fce6d7f0" containerName="mariadb-database-create" Nov 25 15:55:34 crc kubenswrapper[4704]: E1125 15:55:34.609140 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ed81c2-d941-47cf-822b-f3e518a9cd4c" containerName="mariadb-account-create-update" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.609148 4704 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="81ed81c2-d941-47cf-822b-f3e518a9cd4c" containerName="mariadb-account-create-update" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.609337 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ed81c2-d941-47cf-822b-f3e518a9cd4c" containerName="mariadb-account-create-update" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.609355 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb80a0aa-905e-4823-bf03-0880fce6d7f0" containerName="mariadb-database-create" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.610104 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-8dmzk" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.612845 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.613071 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.613693 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-87jxp" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.655550 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-8dmzk"] Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.687328 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-db-sync-config-data\") pod \"glance-db-sync-8dmzk\" (UID: \"488620be-e832-4e72-9a76-57ce869a39fc\") " pod="glance-kuttl-tests/glance-db-sync-8dmzk" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.687478 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-combined-ca-bundle\") pod \"glance-db-sync-8dmzk\" (UID: \"488620be-e832-4e72-9a76-57ce869a39fc\") " pod="glance-kuttl-tests/glance-db-sync-8dmzk" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.687510 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57rt6\" (UniqueName: \"kubernetes.io/projected/488620be-e832-4e72-9a76-57ce869a39fc-kube-api-access-57rt6\") pod \"glance-db-sync-8dmzk\" (UID: \"488620be-e832-4e72-9a76-57ce869a39fc\") " pod="glance-kuttl-tests/glance-db-sync-8dmzk" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.687538 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-config-data\") pod \"glance-db-sync-8dmzk\" (UID: \"488620be-e832-4e72-9a76-57ce869a39fc\") " pod="glance-kuttl-tests/glance-db-sync-8dmzk" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.789595 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-db-sync-config-data\") pod \"glance-db-sync-8dmzk\" (UID: \"488620be-e832-4e72-9a76-57ce869a39fc\") " pod="glance-kuttl-tests/glance-db-sync-8dmzk" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.789916 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-combined-ca-bundle\") pod \"glance-db-sync-8dmzk\" (UID: \"488620be-e832-4e72-9a76-57ce869a39fc\") " pod="glance-kuttl-tests/glance-db-sync-8dmzk" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.790067 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57rt6\" (UniqueName: 
\"kubernetes.io/projected/488620be-e832-4e72-9a76-57ce869a39fc-kube-api-access-57rt6\") pod \"glance-db-sync-8dmzk\" (UID: \"488620be-e832-4e72-9a76-57ce869a39fc\") " pod="glance-kuttl-tests/glance-db-sync-8dmzk" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.790142 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-config-data\") pod \"glance-db-sync-8dmzk\" (UID: \"488620be-e832-4e72-9a76-57ce869a39fc\") " pod="glance-kuttl-tests/glance-db-sync-8dmzk" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.806962 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-combined-ca-bundle\") pod \"glance-db-sync-8dmzk\" (UID: \"488620be-e832-4e72-9a76-57ce869a39fc\") " pod="glance-kuttl-tests/glance-db-sync-8dmzk" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.807018 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-config-data\") pod \"glance-db-sync-8dmzk\" (UID: \"488620be-e832-4e72-9a76-57ce869a39fc\") " pod="glance-kuttl-tests/glance-db-sync-8dmzk" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.807676 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-db-sync-config-data\") pod \"glance-db-sync-8dmzk\" (UID: \"488620be-e832-4e72-9a76-57ce869a39fc\") " pod="glance-kuttl-tests/glance-db-sync-8dmzk" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.809654 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57rt6\" (UniqueName: \"kubernetes.io/projected/488620be-e832-4e72-9a76-57ce869a39fc-kube-api-access-57rt6\") pod 
\"glance-db-sync-8dmzk\" (UID: \"488620be-e832-4e72-9a76-57ce869a39fc\") " pod="glance-kuttl-tests/glance-db-sync-8dmzk" Nov 25 15:55:34 crc kubenswrapper[4704]: I1125 15:55:34.935463 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-8dmzk" Nov 25 15:55:35 crc kubenswrapper[4704]: I1125 15:55:35.399594 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-8dmzk"] Nov 25 15:55:35 crc kubenswrapper[4704]: I1125 15:55:35.879032 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-8dmzk" event={"ID":"488620be-e832-4e72-9a76-57ce869a39fc","Type":"ContainerStarted","Data":"56518f253ce2b747e0e446f16f45a47931a362feffa57a39d8f73c23f84c3ed4"} Nov 25 15:55:36 crc kubenswrapper[4704]: I1125 15:55:36.889235 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-8dmzk" event={"ID":"488620be-e832-4e72-9a76-57ce869a39fc","Type":"ContainerStarted","Data":"2b7df9b4018e475cfbd3a363c2ba12b1a4c5b4b33bd230ea9a0755f18b3ed8a9"} Nov 25 15:55:36 crc kubenswrapper[4704]: I1125 15:55:36.909721 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-8dmzk" podStartSLOduration=2.90970147 podStartE2EDuration="2.90970147s" podCreationTimestamp="2025-11-25 15:55:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:55:36.908408512 +0000 UTC m=+1223.176682313" watchObservedRunningTime="2025-11-25 15:55:36.90970147 +0000 UTC m=+1223.177975271" Nov 25 15:55:37 crc kubenswrapper[4704]: I1125 15:55:37.964355 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Nov 25 15:55:37 crc kubenswrapper[4704]: I1125 15:55:37.964855 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:55:39 crc kubenswrapper[4704]: I1125 15:55:39.924011 4704 generic.go:334] "Generic (PLEG): container finished" podID="488620be-e832-4e72-9a76-57ce869a39fc" containerID="2b7df9b4018e475cfbd3a363c2ba12b1a4c5b4b33bd230ea9a0755f18b3ed8a9" exitCode=0 Nov 25 15:55:39 crc kubenswrapper[4704]: I1125 15:55:39.924192 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-8dmzk" event={"ID":"488620be-e832-4e72-9a76-57ce869a39fc","Type":"ContainerDied","Data":"2b7df9b4018e475cfbd3a363c2ba12b1a4c5b4b33bd230ea9a0755f18b3ed8a9"} Nov 25 15:55:41 crc kubenswrapper[4704]: I1125 15:55:41.234292 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-8dmzk" Nov 25 15:55:41 crc kubenswrapper[4704]: I1125 15:55:41.394972 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-combined-ca-bundle\") pod \"488620be-e832-4e72-9a76-57ce869a39fc\" (UID: \"488620be-e832-4e72-9a76-57ce869a39fc\") " Nov 25 15:55:41 crc kubenswrapper[4704]: I1125 15:55:41.395012 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57rt6\" (UniqueName: \"kubernetes.io/projected/488620be-e832-4e72-9a76-57ce869a39fc-kube-api-access-57rt6\") pod \"488620be-e832-4e72-9a76-57ce869a39fc\" (UID: \"488620be-e832-4e72-9a76-57ce869a39fc\") " Nov 25 15:55:41 crc kubenswrapper[4704]: I1125 15:55:41.395072 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-db-sync-config-data\") pod \"488620be-e832-4e72-9a76-57ce869a39fc\" (UID: \"488620be-e832-4e72-9a76-57ce869a39fc\") " Nov 25 15:55:41 crc kubenswrapper[4704]: I1125 15:55:41.395142 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-config-data\") pod \"488620be-e832-4e72-9a76-57ce869a39fc\" (UID: \"488620be-e832-4e72-9a76-57ce869a39fc\") " Nov 25 15:55:41 crc kubenswrapper[4704]: I1125 15:55:41.401305 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "488620be-e832-4e72-9a76-57ce869a39fc" (UID: "488620be-e832-4e72-9a76-57ce869a39fc"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:55:41 crc kubenswrapper[4704]: I1125 15:55:41.401996 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/488620be-e832-4e72-9a76-57ce869a39fc-kube-api-access-57rt6" (OuterVolumeSpecName: "kube-api-access-57rt6") pod "488620be-e832-4e72-9a76-57ce869a39fc" (UID: "488620be-e832-4e72-9a76-57ce869a39fc"). InnerVolumeSpecName "kube-api-access-57rt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:55:41 crc kubenswrapper[4704]: I1125 15:55:41.419736 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "488620be-e832-4e72-9a76-57ce869a39fc" (UID: "488620be-e832-4e72-9a76-57ce869a39fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:55:41 crc kubenswrapper[4704]: I1125 15:55:41.435658 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-config-data" (OuterVolumeSpecName: "config-data") pod "488620be-e832-4e72-9a76-57ce869a39fc" (UID: "488620be-e832-4e72-9a76-57ce869a39fc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:55:41 crc kubenswrapper[4704]: I1125 15:55:41.498514 4704 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:41 crc kubenswrapper[4704]: I1125 15:55:41.498583 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57rt6\" (UniqueName: \"kubernetes.io/projected/488620be-e832-4e72-9a76-57ce869a39fc-kube-api-access-57rt6\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:41 crc kubenswrapper[4704]: I1125 15:55:41.498598 4704 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:41 crc kubenswrapper[4704]: I1125 15:55:41.498608 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/488620be-e832-4e72-9a76-57ce869a39fc-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:41 crc kubenswrapper[4704]: I1125 15:55:41.939399 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-8dmzk" event={"ID":"488620be-e832-4e72-9a76-57ce869a39fc","Type":"ContainerDied","Data":"56518f253ce2b747e0e446f16f45a47931a362feffa57a39d8f73c23f84c3ed4"} Nov 25 15:55:41 crc kubenswrapper[4704]: I1125 15:55:41.939451 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56518f253ce2b747e0e446f16f45a47931a362feffa57a39d8f73c23f84c3ed4" Nov 25 15:55:41 crc kubenswrapper[4704]: I1125 15:55:41.939522 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-8dmzk" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.315641 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:55:42 crc kubenswrapper[4704]: E1125 15:55:42.316057 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488620be-e832-4e72-9a76-57ce869a39fc" containerName="glance-db-sync" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.316098 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="488620be-e832-4e72-9a76-57ce869a39fc" containerName="glance-db-sync" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.316481 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="488620be-e832-4e72-9a76-57ce869a39fc" containerName="glance-db-sync" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.317378 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.322159 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.322557 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.322760 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-87jxp" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.323143 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.323272 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.323329 4704 
reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.340586 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.514520 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.514578 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.514624 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.514679 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-scripts\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.514709 4704 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-426dd\" (UniqueName: \"kubernetes.io/projected/c2a38f28-fd42-4cac-a986-65dd712eef57-kube-api-access-426dd\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.514728 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-config-data\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.514745 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c2a38f28-fd42-4cac-a986-65dd712eef57-httpd-run\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.514783 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2a38f28-fd42-4cac-a986-65dd712eef57-logs\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.514845 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.616607 4704 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.616740 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.616823 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-scripts\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.616843 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-426dd\" (UniqueName: \"kubernetes.io/projected/c2a38f28-fd42-4cac-a986-65dd712eef57-kube-api-access-426dd\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.616867 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-config-data\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.616890 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/c2a38f28-fd42-4cac-a986-65dd712eef57-httpd-run\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.616923 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2a38f28-fd42-4cac-a986-65dd712eef57-logs\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.616949 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.616976 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.617322 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.617824 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c2a38f28-fd42-4cac-a986-65dd712eef57-logs\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.618093 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c2a38f28-fd42-4cac-a986-65dd712eef57-httpd-run\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.622040 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.622102 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.626980 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-config-data\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.628968 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-scripts\") pod \"glance-default-single-0\" (UID: 
\"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.629259 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.638759 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-426dd\" (UniqueName: \"kubernetes.io/projected/c2a38f28-fd42-4cac-a986-65dd712eef57-kube-api-access-426dd\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.652132 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:42 crc kubenswrapper[4704]: I1125 15:55:42.934042 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:43 crc kubenswrapper[4704]: I1125 15:55:43.313895 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:55:43 crc kubenswrapper[4704]: I1125 15:55:43.541447 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:55:43 crc kubenswrapper[4704]: I1125 15:55:43.995090 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c2a38f28-fd42-4cac-a986-65dd712eef57","Type":"ContainerStarted","Data":"05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58"} Nov 25 15:55:43 crc kubenswrapper[4704]: I1125 15:55:43.995573 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c2a38f28-fd42-4cac-a986-65dd712eef57","Type":"ContainerStarted","Data":"35e3609dba478c2c9ad9ef2b7832cfb4321135f9cff672926c8ec3573bdd9eae"} Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.003162 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c2a38f28-fd42-4cac-a986-65dd712eef57","Type":"ContainerStarted","Data":"d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b"} Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.003300 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="c2a38f28-fd42-4cac-a986-65dd712eef57" containerName="glance-log" containerID="cri-o://05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58" gracePeriod=30 Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.003430 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="c2a38f28-fd42-4cac-a986-65dd712eef57" containerName="glance-httpd" 
containerID="cri-o://d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b" gracePeriod=30 Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.030635 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=3.030608758 podStartE2EDuration="3.030608758s" podCreationTimestamp="2025-11-25 15:55:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:55:45.029743063 +0000 UTC m=+1231.298016864" watchObservedRunningTime="2025-11-25 15:55:45.030608758 +0000 UTC m=+1231.298882539" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.525734 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.666517 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-combined-ca-bundle\") pod \"c2a38f28-fd42-4cac-a986-65dd712eef57\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.666616 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-scripts\") pod \"c2a38f28-fd42-4cac-a986-65dd712eef57\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.666676 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-public-tls-certs\") pod \"c2a38f28-fd42-4cac-a986-65dd712eef57\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.666715 4704 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-config-data\") pod \"c2a38f28-fd42-4cac-a986-65dd712eef57\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.666782 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2a38f28-fd42-4cac-a986-65dd712eef57-logs\") pod \"c2a38f28-fd42-4cac-a986-65dd712eef57\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.666857 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-internal-tls-certs\") pod \"c2a38f28-fd42-4cac-a986-65dd712eef57\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.666934 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-426dd\" (UniqueName: \"kubernetes.io/projected/c2a38f28-fd42-4cac-a986-65dd712eef57-kube-api-access-426dd\") pod \"c2a38f28-fd42-4cac-a986-65dd712eef57\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.667599 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"c2a38f28-fd42-4cac-a986-65dd712eef57\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.667698 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c2a38f28-fd42-4cac-a986-65dd712eef57-httpd-run\") pod \"c2a38f28-fd42-4cac-a986-65dd712eef57\" (UID: \"c2a38f28-fd42-4cac-a986-65dd712eef57\") " 
Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.667752 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2a38f28-fd42-4cac-a986-65dd712eef57-logs" (OuterVolumeSpecName: "logs") pod "c2a38f28-fd42-4cac-a986-65dd712eef57" (UID: "c2a38f28-fd42-4cac-a986-65dd712eef57"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.668023 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2a38f28-fd42-4cac-a986-65dd712eef57-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c2a38f28-fd42-4cac-a986-65dd712eef57" (UID: "c2a38f28-fd42-4cac-a986-65dd712eef57"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.668300 4704 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c2a38f28-fd42-4cac-a986-65dd712eef57-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.668328 4704 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2a38f28-fd42-4cac-a986-65dd712eef57-logs\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.673894 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-scripts" (OuterVolumeSpecName: "scripts") pod "c2a38f28-fd42-4cac-a986-65dd712eef57" (UID: "c2a38f28-fd42-4cac-a986-65dd712eef57"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.678418 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a38f28-fd42-4cac-a986-65dd712eef57-kube-api-access-426dd" (OuterVolumeSpecName: "kube-api-access-426dd") pod "c2a38f28-fd42-4cac-a986-65dd712eef57" (UID: "c2a38f28-fd42-4cac-a986-65dd712eef57"). InnerVolumeSpecName "kube-api-access-426dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.689084 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "c2a38f28-fd42-4cac-a986-65dd712eef57" (UID: "c2a38f28-fd42-4cac-a986-65dd712eef57"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.696668 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2a38f28-fd42-4cac-a986-65dd712eef57" (UID: "c2a38f28-fd42-4cac-a986-65dd712eef57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.712933 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-config-data" (OuterVolumeSpecName: "config-data") pod "c2a38f28-fd42-4cac-a986-65dd712eef57" (UID: "c2a38f28-fd42-4cac-a986-65dd712eef57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.713918 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c2a38f28-fd42-4cac-a986-65dd712eef57" (UID: "c2a38f28-fd42-4cac-a986-65dd712eef57"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.715246 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c2a38f28-fd42-4cac-a986-65dd712eef57" (UID: "c2a38f28-fd42-4cac-a986-65dd712eef57"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.769446 4704 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.769491 4704 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.769500 4704 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.769509 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-config-data\") on node \"crc\" DevicePath \"\"" Nov 
25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.769517 4704 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2a38f28-fd42-4cac-a986-65dd712eef57-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.769527 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-426dd\" (UniqueName: \"kubernetes.io/projected/c2a38f28-fd42-4cac-a986-65dd712eef57-kube-api-access-426dd\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.769562 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.784505 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Nov 25 15:55:45 crc kubenswrapper[4704]: I1125 15:55:45.871631 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.011536 4704 generic.go:334] "Generic (PLEG): container finished" podID="c2a38f28-fd42-4cac-a986-65dd712eef57" containerID="d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b" exitCode=0 Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.011566 4704 generic.go:334] "Generic (PLEG): container finished" podID="c2a38f28-fd42-4cac-a986-65dd712eef57" containerID="05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58" exitCode=143 Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.011589 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" 
event={"ID":"c2a38f28-fd42-4cac-a986-65dd712eef57","Type":"ContainerDied","Data":"d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b"} Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.011619 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c2a38f28-fd42-4cac-a986-65dd712eef57","Type":"ContainerDied","Data":"05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58"} Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.011632 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c2a38f28-fd42-4cac-a986-65dd712eef57","Type":"ContainerDied","Data":"35e3609dba478c2c9ad9ef2b7832cfb4321135f9cff672926c8ec3573bdd9eae"} Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.011651 4704 scope.go:117] "RemoveContainer" containerID="d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.011678 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.034095 4704 scope.go:117] "RemoveContainer" containerID="05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.045464 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.051053 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.065296 4704 scope.go:117] "RemoveContainer" containerID="d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b" Nov 25 15:55:46 crc kubenswrapper[4704]: E1125 15:55:46.069339 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b\": container with ID starting with d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b not found: ID does not exist" containerID="d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.069400 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b"} err="failed to get container status \"d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b\": rpc error: code = NotFound desc = could not find container \"d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b\": container with ID starting with d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b not found: ID does not exist" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.069446 4704 scope.go:117] "RemoveContainer" 
containerID="05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58" Nov 25 15:55:46 crc kubenswrapper[4704]: E1125 15:55:46.069741 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58\": container with ID starting with 05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58 not found: ID does not exist" containerID="05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.069768 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58"} err="failed to get container status \"05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58\": rpc error: code = NotFound desc = could not find container \"05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58\": container with ID starting with 05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58 not found: ID does not exist" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.069785 4704 scope.go:117] "RemoveContainer" containerID="d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.070088 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b"} err="failed to get container status \"d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b\": rpc error: code = NotFound desc = could not find container \"d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b\": container with ID starting with d746a596a67b37e2231b0ab7c3e663696d30e03055dd7d1f6144b68c94ec625b not found: ID does not exist" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.070109 4704 scope.go:117] 
"RemoveContainer" containerID="05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.070407 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58"} err="failed to get container status \"05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58\": rpc error: code = NotFound desc = could not find container \"05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58\": container with ID starting with 05031d22fd1ab2667495b0629565e394669930e1c45b8e242b44998d6d4fdb58 not found: ID does not exist" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.072966 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:55:46 crc kubenswrapper[4704]: E1125 15:55:46.073306 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a38f28-fd42-4cac-a986-65dd712eef57" containerName="glance-httpd" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.073330 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a38f28-fd42-4cac-a986-65dd712eef57" containerName="glance-httpd" Nov 25 15:55:46 crc kubenswrapper[4704]: E1125 15:55:46.073357 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a38f28-fd42-4cac-a986-65dd712eef57" containerName="glance-log" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.073365 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a38f28-fd42-4cac-a986-65dd712eef57" containerName="glance-log" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.073520 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a38f28-fd42-4cac-a986-65dd712eef57" containerName="glance-log" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.073546 4704 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c2a38f28-fd42-4cac-a986-65dd712eef57" containerName="glance-httpd" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.074784 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.077243 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.077367 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.077582 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.077896 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.078072 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-87jxp" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.078222 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.091108 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.188407 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.188569 4704 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.188607 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-config-data\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.188701 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2bvh\" (UniqueName: \"kubernetes.io/projected/ea901657-54a0-4000-8203-2b655bff8505-kube-api-access-h2bvh\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.188742 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea901657-54a0-4000-8203-2b655bff8505-logs\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.188852 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-scripts\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.188892 
4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.188930 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.188975 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea901657-54a0-4000-8203-2b655bff8505-httpd-run\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.290204 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.290266 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.290293 4704 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-config-data\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.290316 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2bvh\" (UniqueName: \"kubernetes.io/projected/ea901657-54a0-4000-8203-2b655bff8505-kube-api-access-h2bvh\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.290334 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea901657-54a0-4000-8203-2b655bff8505-logs\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.290364 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-scripts\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.290389 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.290419 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.290445 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea901657-54a0-4000-8203-2b655bff8505-httpd-run\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.290959 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea901657-54a0-4000-8203-2b655bff8505-httpd-run\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.291545 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea901657-54a0-4000-8203-2b655bff8505-logs\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.295945 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.296086 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: 
\"ea901657-54a0-4000-8203-2b655bff8505\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.297281 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-config-data\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.377099 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2bvh\" (UniqueName: \"kubernetes.io/projected/ea901657-54a0-4000-8203-2b655bff8505-kube-api-access-h2bvh\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.380933 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.381006 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.382299 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-scripts\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " 
pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.400737 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-single-0\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.427545 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a38f28-fd42-4cac-a986-65dd712eef57" path="/var/lib/kubelet/pods/c2a38f28-fd42-4cac-a986-65dd712eef57/volumes" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.697277 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:46 crc kubenswrapper[4704]: I1125 15:55:46.934570 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:55:47 crc kubenswrapper[4704]: I1125 15:55:47.021541 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ea901657-54a0-4000-8203-2b655bff8505","Type":"ContainerStarted","Data":"1a353823fefd9d6544c7291d6be204fe3161348060aef4c75b7f9043f81bae4a"} Nov 25 15:55:48 crc kubenswrapper[4704]: I1125 15:55:48.035105 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ea901657-54a0-4000-8203-2b655bff8505","Type":"ContainerStarted","Data":"678aac0ae1fb12958716ee1f844bebe15d5d4ddf6265f88c7a7501bb1360d451"} Nov 25 15:55:48 crc kubenswrapper[4704]: I1125 15:55:48.035618 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ea901657-54a0-4000-8203-2b655bff8505","Type":"ContainerStarted","Data":"6eb10cbac39ab2723818ebee443fa3c37df6fee2a50b5bfdc2b02660c87f7846"} Nov 25 15:55:48 crc 
kubenswrapper[4704]: I1125 15:55:48.054260 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.054238624 podStartE2EDuration="2.054238624s" podCreationTimestamp="2025-11-25 15:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:55:48.053673468 +0000 UTC m=+1234.321947259" watchObservedRunningTime="2025-11-25 15:55:48.054238624 +0000 UTC m=+1234.322512405" Nov 25 15:55:56 crc kubenswrapper[4704]: I1125 15:55:56.698753 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:56 crc kubenswrapper[4704]: I1125 15:55:56.699769 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:56 crc kubenswrapper[4704]: I1125 15:55:56.729655 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:56 crc kubenswrapper[4704]: I1125 15:55:56.737641 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:57 crc kubenswrapper[4704]: I1125 15:55:57.109429 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:57 crc kubenswrapper[4704]: I1125 15:55:57.109492 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:59 crc kubenswrapper[4704]: I1125 15:55:59.182064 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:55:59 crc kubenswrapper[4704]: I1125 15:55:59.182665 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:56:00 crc kubenswrapper[4704]: I1125 15:56:00.664276 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-8dmzk"] Nov 25 15:56:00 crc kubenswrapper[4704]: I1125 15:56:00.669508 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-8dmzk"] Nov 25 15:56:00 crc kubenswrapper[4704]: I1125 15:56:00.719897 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance3787-account-delete-h8vcg"] Nov 25 15:56:00 crc kubenswrapper[4704]: I1125 15:56:00.721420 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3787-account-delete-h8vcg" Nov 25 15:56:00 crc kubenswrapper[4704]: I1125 15:56:00.731879 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance3787-account-delete-h8vcg"] Nov 25 15:56:00 crc kubenswrapper[4704]: I1125 15:56:00.799002 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:56:00 crc kubenswrapper[4704]: I1125 15:56:00.807408 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66c4430d-a41b-4f2e-839f-8df89b1c68b9-operator-scripts\") pod \"glance3787-account-delete-h8vcg\" (UID: \"66c4430d-a41b-4f2e-839f-8df89b1c68b9\") " pod="glance-kuttl-tests/glance3787-account-delete-h8vcg" Nov 25 15:56:00 crc kubenswrapper[4704]: I1125 15:56:00.807848 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j674n\" (UniqueName: \"kubernetes.io/projected/66c4430d-a41b-4f2e-839f-8df89b1c68b9-kube-api-access-j674n\") pod \"glance3787-account-delete-h8vcg\" (UID: \"66c4430d-a41b-4f2e-839f-8df89b1c68b9\") " pod="glance-kuttl-tests/glance3787-account-delete-h8vcg" Nov 25 15:56:00 crc kubenswrapper[4704]: 
I1125 15:56:00.909028 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66c4430d-a41b-4f2e-839f-8df89b1c68b9-operator-scripts\") pod \"glance3787-account-delete-h8vcg\" (UID: \"66c4430d-a41b-4f2e-839f-8df89b1c68b9\") " pod="glance-kuttl-tests/glance3787-account-delete-h8vcg" Nov 25 15:56:00 crc kubenswrapper[4704]: I1125 15:56:00.909392 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j674n\" (UniqueName: \"kubernetes.io/projected/66c4430d-a41b-4f2e-839f-8df89b1c68b9-kube-api-access-j674n\") pod \"glance3787-account-delete-h8vcg\" (UID: \"66c4430d-a41b-4f2e-839f-8df89b1c68b9\") " pod="glance-kuttl-tests/glance3787-account-delete-h8vcg" Nov 25 15:56:00 crc kubenswrapper[4704]: I1125 15:56:00.910007 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66c4430d-a41b-4f2e-839f-8df89b1c68b9-operator-scripts\") pod \"glance3787-account-delete-h8vcg\" (UID: \"66c4430d-a41b-4f2e-839f-8df89b1c68b9\") " pod="glance-kuttl-tests/glance3787-account-delete-h8vcg" Nov 25 15:56:00 crc kubenswrapper[4704]: I1125 15:56:00.937731 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j674n\" (UniqueName: \"kubernetes.io/projected/66c4430d-a41b-4f2e-839f-8df89b1c68b9-kube-api-access-j674n\") pod \"glance3787-account-delete-h8vcg\" (UID: \"66c4430d-a41b-4f2e-839f-8df89b1c68b9\") " pod="glance-kuttl-tests/glance3787-account-delete-h8vcg" Nov 25 15:56:01 crc kubenswrapper[4704]: I1125 15:56:01.050686 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance3787-account-delete-h8vcg" Nov 25 15:56:01 crc kubenswrapper[4704]: I1125 15:56:01.134632 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="ea901657-54a0-4000-8203-2b655bff8505" containerName="glance-log" containerID="cri-o://6eb10cbac39ab2723818ebee443fa3c37df6fee2a50b5bfdc2b02660c87f7846" gracePeriod=30 Nov 25 15:56:01 crc kubenswrapper[4704]: I1125 15:56:01.134831 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="ea901657-54a0-4000-8203-2b655bff8505" containerName="glance-httpd" containerID="cri-o://678aac0ae1fb12958716ee1f844bebe15d5d4ddf6265f88c7a7501bb1360d451" gracePeriod=30 Nov 25 15:56:01 crc kubenswrapper[4704]: I1125 15:56:01.140009 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="ea901657-54a0-4000-8203-2b655bff8505" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.111:9292/healthcheck\": EOF" Nov 25 15:56:01 crc kubenswrapper[4704]: I1125 15:56:01.563199 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance3787-account-delete-h8vcg"] Nov 25 15:56:02 crc kubenswrapper[4704]: I1125 15:56:02.144126 4704 generic.go:334] "Generic (PLEG): container finished" podID="66c4430d-a41b-4f2e-839f-8df89b1c68b9" containerID="cdef7dd544dbfdb0345ab8b5776a2039078d0ed4d4b4b516b7e0e58f6ad92a4f" exitCode=0 Nov 25 15:56:02 crc kubenswrapper[4704]: I1125 15:56:02.144203 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3787-account-delete-h8vcg" event={"ID":"66c4430d-a41b-4f2e-839f-8df89b1c68b9","Type":"ContainerDied","Data":"cdef7dd544dbfdb0345ab8b5776a2039078d0ed4d4b4b516b7e0e58f6ad92a4f"} Nov 25 15:56:02 crc kubenswrapper[4704]: I1125 15:56:02.144252 4704 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="glance-kuttl-tests/glance3787-account-delete-h8vcg" event={"ID":"66c4430d-a41b-4f2e-839f-8df89b1c68b9","Type":"ContainerStarted","Data":"7247a8a0475d5d5adc93cc96d75c0bae6c2176b54e247d1926d6e95d2c23cf82"} Nov 25 15:56:02 crc kubenswrapper[4704]: I1125 15:56:02.146761 4704 generic.go:334] "Generic (PLEG): container finished" podID="ea901657-54a0-4000-8203-2b655bff8505" containerID="6eb10cbac39ab2723818ebee443fa3c37df6fee2a50b5bfdc2b02660c87f7846" exitCode=143 Nov 25 15:56:02 crc kubenswrapper[4704]: I1125 15:56:02.146828 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ea901657-54a0-4000-8203-2b655bff8505","Type":"ContainerDied","Data":"6eb10cbac39ab2723818ebee443fa3c37df6fee2a50b5bfdc2b02660c87f7846"} Nov 25 15:56:02 crc kubenswrapper[4704]: I1125 15:56:02.424779 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="488620be-e832-4e72-9a76-57ce869a39fc" path="/var/lib/kubelet/pods/488620be-e832-4e72-9a76-57ce869a39fc/volumes" Nov 25 15:56:03 crc kubenswrapper[4704]: I1125 15:56:03.436212 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance3787-account-delete-h8vcg" Nov 25 15:56:03 crc kubenswrapper[4704]: I1125 15:56:03.548417 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66c4430d-a41b-4f2e-839f-8df89b1c68b9-operator-scripts\") pod \"66c4430d-a41b-4f2e-839f-8df89b1c68b9\" (UID: \"66c4430d-a41b-4f2e-839f-8df89b1c68b9\") " Nov 25 15:56:03 crc kubenswrapper[4704]: I1125 15:56:03.548527 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j674n\" (UniqueName: \"kubernetes.io/projected/66c4430d-a41b-4f2e-839f-8df89b1c68b9-kube-api-access-j674n\") pod \"66c4430d-a41b-4f2e-839f-8df89b1c68b9\" (UID: \"66c4430d-a41b-4f2e-839f-8df89b1c68b9\") " Nov 25 15:56:03 crc kubenswrapper[4704]: I1125 15:56:03.549597 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66c4430d-a41b-4f2e-839f-8df89b1c68b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66c4430d-a41b-4f2e-839f-8df89b1c68b9" (UID: "66c4430d-a41b-4f2e-839f-8df89b1c68b9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 15:56:03 crc kubenswrapper[4704]: I1125 15:56:03.556890 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c4430d-a41b-4f2e-839f-8df89b1c68b9-kube-api-access-j674n" (OuterVolumeSpecName: "kube-api-access-j674n") pod "66c4430d-a41b-4f2e-839f-8df89b1c68b9" (UID: "66c4430d-a41b-4f2e-839f-8df89b1c68b9"). InnerVolumeSpecName "kube-api-access-j674n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:56:03 crc kubenswrapper[4704]: I1125 15:56:03.649884 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j674n\" (UniqueName: \"kubernetes.io/projected/66c4430d-a41b-4f2e-839f-8df89b1c68b9-kube-api-access-j674n\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:03 crc kubenswrapper[4704]: I1125 15:56:03.649928 4704 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66c4430d-a41b-4f2e-839f-8df89b1c68b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.163584 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3787-account-delete-h8vcg" event={"ID":"66c4430d-a41b-4f2e-839f-8df89b1c68b9","Type":"ContainerDied","Data":"7247a8a0475d5d5adc93cc96d75c0bae6c2176b54e247d1926d6e95d2c23cf82"} Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.163628 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7247a8a0475d5d5adc93cc96d75c0bae6c2176b54e247d1926d6e95d2c23cf82" Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.163639 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3787-account-delete-h8vcg" Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.870191 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.964435 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea901657-54a0-4000-8203-2b655bff8505-logs\") pod \"ea901657-54a0-4000-8203-2b655bff8505\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.964476 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-scripts\") pod \"ea901657-54a0-4000-8203-2b655bff8505\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.964511 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ea901657-54a0-4000-8203-2b655bff8505\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.964541 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-combined-ca-bundle\") pod \"ea901657-54a0-4000-8203-2b655bff8505\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.964581 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2bvh\" (UniqueName: \"kubernetes.io/projected/ea901657-54a0-4000-8203-2b655bff8505-kube-api-access-h2bvh\") pod \"ea901657-54a0-4000-8203-2b655bff8505\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.964670 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-config-data\") pod \"ea901657-54a0-4000-8203-2b655bff8505\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.964723 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-public-tls-certs\") pod \"ea901657-54a0-4000-8203-2b655bff8505\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.964746 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-internal-tls-certs\") pod \"ea901657-54a0-4000-8203-2b655bff8505\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.964775 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea901657-54a0-4000-8203-2b655bff8505-httpd-run\") pod \"ea901657-54a0-4000-8203-2b655bff8505\" (UID: \"ea901657-54a0-4000-8203-2b655bff8505\") " Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.965734 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea901657-54a0-4000-8203-2b655bff8505-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ea901657-54a0-4000-8203-2b655bff8505" (UID: "ea901657-54a0-4000-8203-2b655bff8505"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.965832 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea901657-54a0-4000-8203-2b655bff8505-logs" (OuterVolumeSpecName: "logs") pod "ea901657-54a0-4000-8203-2b655bff8505" (UID: "ea901657-54a0-4000-8203-2b655bff8505"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.971731 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "ea901657-54a0-4000-8203-2b655bff8505" (UID: "ea901657-54a0-4000-8203-2b655bff8505"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.972262 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-scripts" (OuterVolumeSpecName: "scripts") pod "ea901657-54a0-4000-8203-2b655bff8505" (UID: "ea901657-54a0-4000-8203-2b655bff8505"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.973959 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea901657-54a0-4000-8203-2b655bff8505-kube-api-access-h2bvh" (OuterVolumeSpecName: "kube-api-access-h2bvh") pod "ea901657-54a0-4000-8203-2b655bff8505" (UID: "ea901657-54a0-4000-8203-2b655bff8505"). InnerVolumeSpecName "kube-api-access-h2bvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:56:04 crc kubenswrapper[4704]: I1125 15:56:04.991441 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea901657-54a0-4000-8203-2b655bff8505" (UID: "ea901657-54a0-4000-8203-2b655bff8505"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.008585 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ea901657-54a0-4000-8203-2b655bff8505" (UID: "ea901657-54a0-4000-8203-2b655bff8505"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.010174 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ea901657-54a0-4000-8203-2b655bff8505" (UID: "ea901657-54a0-4000-8203-2b655bff8505"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.014425 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-config-data" (OuterVolumeSpecName: "config-data") pod "ea901657-54a0-4000-8203-2b655bff8505" (UID: "ea901657-54a0-4000-8203-2b655bff8505"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.067208 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.067276 4704 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.067289 4704 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.067302 4704 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ea901657-54a0-4000-8203-2b655bff8505-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.067318 4704 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea901657-54a0-4000-8203-2b655bff8505-logs\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.067328 4704 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.067371 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.067381 4704 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea901657-54a0-4000-8203-2b655bff8505-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.067392 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2bvh\" (UniqueName: \"kubernetes.io/projected/ea901657-54a0-4000-8203-2b655bff8505-kube-api-access-h2bvh\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.084692 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.167979 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.175300 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.175293 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ea901657-54a0-4000-8203-2b655bff8505","Type":"ContainerDied","Data":"678aac0ae1fb12958716ee1f844bebe15d5d4ddf6265f88c7a7501bb1360d451"} Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.175399 4704 scope.go:117] "RemoveContainer" containerID="678aac0ae1fb12958716ee1f844bebe15d5d4ddf6265f88c7a7501bb1360d451" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.175245 4704 generic.go:334] "Generic (PLEG): container finished" podID="ea901657-54a0-4000-8203-2b655bff8505" containerID="678aac0ae1fb12958716ee1f844bebe15d5d4ddf6265f88c7a7501bb1360d451" exitCode=0 Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.175655 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"ea901657-54a0-4000-8203-2b655bff8505","Type":"ContainerDied","Data":"1a353823fefd9d6544c7291d6be204fe3161348060aef4c75b7f9043f81bae4a"} Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.198981 4704 scope.go:117] "RemoveContainer" containerID="6eb10cbac39ab2723818ebee443fa3c37df6fee2a50b5bfdc2b02660c87f7846" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.215981 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.222326 4704 scope.go:117] "RemoveContainer" containerID="678aac0ae1fb12958716ee1f844bebe15d5d4ddf6265f88c7a7501bb1360d451" Nov 25 15:56:05 crc kubenswrapper[4704]: E1125 15:56:05.223065 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"678aac0ae1fb12958716ee1f844bebe15d5d4ddf6265f88c7a7501bb1360d451\": container with ID starting with 
678aac0ae1fb12958716ee1f844bebe15d5d4ddf6265f88c7a7501bb1360d451 not found: ID does not exist" containerID="678aac0ae1fb12958716ee1f844bebe15d5d4ddf6265f88c7a7501bb1360d451" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.223169 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"678aac0ae1fb12958716ee1f844bebe15d5d4ddf6265f88c7a7501bb1360d451"} err="failed to get container status \"678aac0ae1fb12958716ee1f844bebe15d5d4ddf6265f88c7a7501bb1360d451\": rpc error: code = NotFound desc = could not find container \"678aac0ae1fb12958716ee1f844bebe15d5d4ddf6265f88c7a7501bb1360d451\": container with ID starting with 678aac0ae1fb12958716ee1f844bebe15d5d4ddf6265f88c7a7501bb1360d451 not found: ID does not exist" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.223280 4704 scope.go:117] "RemoveContainer" containerID="6eb10cbac39ab2723818ebee443fa3c37df6fee2a50b5bfdc2b02660c87f7846" Nov 25 15:56:05 crc kubenswrapper[4704]: E1125 15:56:05.224702 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eb10cbac39ab2723818ebee443fa3c37df6fee2a50b5bfdc2b02660c87f7846\": container with ID starting with 6eb10cbac39ab2723818ebee443fa3c37df6fee2a50b5bfdc2b02660c87f7846 not found: ID does not exist" containerID="6eb10cbac39ab2723818ebee443fa3c37df6fee2a50b5bfdc2b02660c87f7846" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.224730 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eb10cbac39ab2723818ebee443fa3c37df6fee2a50b5bfdc2b02660c87f7846"} err="failed to get container status \"6eb10cbac39ab2723818ebee443fa3c37df6fee2a50b5bfdc2b02660c87f7846\": rpc error: code = NotFound desc = could not find container \"6eb10cbac39ab2723818ebee443fa3c37df6fee2a50b5bfdc2b02660c87f7846\": container with ID starting with 6eb10cbac39ab2723818ebee443fa3c37df6fee2a50b5bfdc2b02660c87f7846 not found: ID does not 
exist" Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.226036 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.741447 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-n9llf"] Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.746583 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-n9llf"] Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.752215 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance3787-account-delete-h8vcg"] Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.757497 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-3787-account-create-update-w6j7v"] Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.762430 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance3787-account-delete-h8vcg"] Nov 25 15:56:05 crc kubenswrapper[4704]: I1125 15:56:05.767993 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-3787-account-create-update-w6j7v"] Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.268263 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-jbcwh"] Nov 25 15:56:06 crc kubenswrapper[4704]: E1125 15:56:06.269140 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c4430d-a41b-4f2e-839f-8df89b1c68b9" containerName="mariadb-account-delete" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.269158 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c4430d-a41b-4f2e-839f-8df89b1c68b9" containerName="mariadb-account-delete" Nov 25 15:56:06 crc kubenswrapper[4704]: E1125 15:56:06.269190 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea901657-54a0-4000-8203-2b655bff8505" 
containerName="glance-log" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.269199 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea901657-54a0-4000-8203-2b655bff8505" containerName="glance-log" Nov 25 15:56:06 crc kubenswrapper[4704]: E1125 15:56:06.269214 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea901657-54a0-4000-8203-2b655bff8505" containerName="glance-httpd" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.269221 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea901657-54a0-4000-8203-2b655bff8505" containerName="glance-httpd" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.269352 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c4430d-a41b-4f2e-839f-8df89b1c68b9" containerName="mariadb-account-delete" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.269370 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea901657-54a0-4000-8203-2b655bff8505" containerName="glance-httpd" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.269384 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea901657-54a0-4000-8203-2b655bff8505" containerName="glance-log" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.269995 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-jbcwh" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.274277 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-90ee-account-create-update-mcgs5"] Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.275150 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-90ee-account-create-update-mcgs5" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.277737 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.281665 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-jbcwh"] Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.288354 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-90ee-account-create-update-mcgs5"] Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.386017 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qr9p\" (UniqueName: \"kubernetes.io/projected/9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9-kube-api-access-5qr9p\") pod \"glance-db-create-jbcwh\" (UID: \"9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9\") " pod="glance-kuttl-tests/glance-db-create-jbcwh" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.386138 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9-operator-scripts\") pod \"glance-db-create-jbcwh\" (UID: \"9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9\") " pod="glance-kuttl-tests/glance-db-create-jbcwh" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.386188 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ae716d7-1612-4ba4-8860-733eca08736b-operator-scripts\") pod \"glance-90ee-account-create-update-mcgs5\" (UID: \"3ae716d7-1612-4ba4-8860-733eca08736b\") " pod="glance-kuttl-tests/glance-90ee-account-create-update-mcgs5" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.386236 4704 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n6k8\" (UniqueName: \"kubernetes.io/projected/3ae716d7-1612-4ba4-8860-733eca08736b-kube-api-access-7n6k8\") pod \"glance-90ee-account-create-update-mcgs5\" (UID: \"3ae716d7-1612-4ba4-8860-733eca08736b\") " pod="glance-kuttl-tests/glance-90ee-account-create-update-mcgs5" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.424426 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c4430d-a41b-4f2e-839f-8df89b1c68b9" path="/var/lib/kubelet/pods/66c4430d-a41b-4f2e-839f-8df89b1c68b9/volumes" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.425081 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81ed81c2-d941-47cf-822b-f3e518a9cd4c" path="/var/lib/kubelet/pods/81ed81c2-d941-47cf-822b-f3e518a9cd4c/volumes" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.425684 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb80a0aa-905e-4823-bf03-0880fce6d7f0" path="/var/lib/kubelet/pods/bb80a0aa-905e-4823-bf03-0880fce6d7f0/volumes" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.427042 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea901657-54a0-4000-8203-2b655bff8505" path="/var/lib/kubelet/pods/ea901657-54a0-4000-8203-2b655bff8505/volumes" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.487905 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qr9p\" (UniqueName: \"kubernetes.io/projected/9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9-kube-api-access-5qr9p\") pod \"glance-db-create-jbcwh\" (UID: \"9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9\") " pod="glance-kuttl-tests/glance-db-create-jbcwh" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.487995 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9-operator-scripts\") pod \"glance-db-create-jbcwh\" (UID: \"9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9\") " pod="glance-kuttl-tests/glance-db-create-jbcwh" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.488058 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ae716d7-1612-4ba4-8860-733eca08736b-operator-scripts\") pod \"glance-90ee-account-create-update-mcgs5\" (UID: \"3ae716d7-1612-4ba4-8860-733eca08736b\") " pod="glance-kuttl-tests/glance-90ee-account-create-update-mcgs5" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.488083 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n6k8\" (UniqueName: \"kubernetes.io/projected/3ae716d7-1612-4ba4-8860-733eca08736b-kube-api-access-7n6k8\") pod \"glance-90ee-account-create-update-mcgs5\" (UID: \"3ae716d7-1612-4ba4-8860-733eca08736b\") " pod="glance-kuttl-tests/glance-90ee-account-create-update-mcgs5" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.489317 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ae716d7-1612-4ba4-8860-733eca08736b-operator-scripts\") pod \"glance-90ee-account-create-update-mcgs5\" (UID: \"3ae716d7-1612-4ba4-8860-733eca08736b\") " pod="glance-kuttl-tests/glance-90ee-account-create-update-mcgs5" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.489415 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9-operator-scripts\") pod \"glance-db-create-jbcwh\" (UID: \"9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9\") " pod="glance-kuttl-tests/glance-db-create-jbcwh" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.508227 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-7n6k8\" (UniqueName: \"kubernetes.io/projected/3ae716d7-1612-4ba4-8860-733eca08736b-kube-api-access-7n6k8\") pod \"glance-90ee-account-create-update-mcgs5\" (UID: \"3ae716d7-1612-4ba4-8860-733eca08736b\") " pod="glance-kuttl-tests/glance-90ee-account-create-update-mcgs5" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.508575 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qr9p\" (UniqueName: \"kubernetes.io/projected/9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9-kube-api-access-5qr9p\") pod \"glance-db-create-jbcwh\" (UID: \"9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9\") " pod="glance-kuttl-tests/glance-db-create-jbcwh" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.601044 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-jbcwh" Nov 25 15:56:06 crc kubenswrapper[4704]: I1125 15:56:06.613393 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-90ee-account-create-update-mcgs5" Nov 25 15:56:07 crc kubenswrapper[4704]: I1125 15:56:07.035488 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-jbcwh"] Nov 25 15:56:07 crc kubenswrapper[4704]: I1125 15:56:07.084254 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-90ee-account-create-update-mcgs5"] Nov 25 15:56:07 crc kubenswrapper[4704]: W1125 15:56:07.087579 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ae716d7_1612_4ba4_8860_733eca08736b.slice/crio-04636db5fd6c3ade35c5b47d4749588cf479cd8c765f83e762f1f20f98301f12 WatchSource:0}: Error finding container 04636db5fd6c3ade35c5b47d4749588cf479cd8c765f83e762f1f20f98301f12: Status 404 returned error can't find the container with id 04636db5fd6c3ade35c5b47d4749588cf479cd8c765f83e762f1f20f98301f12 Nov 25 15:56:07 crc 
kubenswrapper[4704]: I1125 15:56:07.191716 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-jbcwh" event={"ID":"9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9","Type":"ContainerStarted","Data":"dee4d1218145033ebd0e3797727e2f3a831ac88a7305b733c1624107c1fe887f"} Nov 25 15:56:07 crc kubenswrapper[4704]: I1125 15:56:07.193270 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-90ee-account-create-update-mcgs5" event={"ID":"3ae716d7-1612-4ba4-8860-733eca08736b","Type":"ContainerStarted","Data":"04636db5fd6c3ade35c5b47d4749588cf479cd8c765f83e762f1f20f98301f12"} Nov 25 15:56:07 crc kubenswrapper[4704]: I1125 15:56:07.965210 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:56:07 crc kubenswrapper[4704]: I1125 15:56:07.965298 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:56:08 crc kubenswrapper[4704]: I1125 15:56:08.203746 4704 generic.go:334] "Generic (PLEG): container finished" podID="3ae716d7-1612-4ba4-8860-733eca08736b" containerID="bde0e01a25eecbb150247fe41eb9cd801171db31f44920282ed771dea6a25965" exitCode=0 Nov 25 15:56:08 crc kubenswrapper[4704]: I1125 15:56:08.203864 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-90ee-account-create-update-mcgs5" event={"ID":"3ae716d7-1612-4ba4-8860-733eca08736b","Type":"ContainerDied","Data":"bde0e01a25eecbb150247fe41eb9cd801171db31f44920282ed771dea6a25965"} Nov 25 15:56:08 crc 
kubenswrapper[4704]: I1125 15:56:08.209273 4704 generic.go:334] "Generic (PLEG): container finished" podID="9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9" containerID="2c503d00362f7bb6ae694255b8afdfe642c071dcdb7d74aef84b1b9f77ede00c" exitCode=0
Nov 25 15:56:08 crc kubenswrapper[4704]: I1125 15:56:08.209332 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-jbcwh" event={"ID":"9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9","Type":"ContainerDied","Data":"2c503d00362f7bb6ae694255b8afdfe642c071dcdb7d74aef84b1b9f77ede00c"}
Nov 25 15:56:09 crc kubenswrapper[4704]: I1125 15:56:09.541858 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-90ee-account-create-update-mcgs5"
Nov 25 15:56:09 crc kubenswrapper[4704]: I1125 15:56:09.548734 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-jbcwh"
Nov 25 15:56:09 crc kubenswrapper[4704]: I1125 15:56:09.635267 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9-operator-scripts\") pod \"9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9\" (UID: \"9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9\") "
Nov 25 15:56:09 crc kubenswrapper[4704]: I1125 15:56:09.635316 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ae716d7-1612-4ba4-8860-733eca08736b-operator-scripts\") pod \"3ae716d7-1612-4ba4-8860-733eca08736b\" (UID: \"3ae716d7-1612-4ba4-8860-733eca08736b\") "
Nov 25 15:56:09 crc kubenswrapper[4704]: I1125 15:56:09.635340 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n6k8\" (UniqueName: \"kubernetes.io/projected/3ae716d7-1612-4ba4-8860-733eca08736b-kube-api-access-7n6k8\") pod \"3ae716d7-1612-4ba4-8860-733eca08736b\" (UID: \"3ae716d7-1612-4ba4-8860-733eca08736b\") "
Nov 25 15:56:09 crc kubenswrapper[4704]: I1125 15:56:09.635459 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qr9p\" (UniqueName: \"kubernetes.io/projected/9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9-kube-api-access-5qr9p\") pod \"9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9\" (UID: \"9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9\") "
Nov 25 15:56:09 crc kubenswrapper[4704]: I1125 15:56:09.636479 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9" (UID: "9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 15:56:09 crc kubenswrapper[4704]: I1125 15:56:09.636477 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ae716d7-1612-4ba4-8860-733eca08736b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ae716d7-1612-4ba4-8860-733eca08736b" (UID: "3ae716d7-1612-4ba4-8860-733eca08736b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 15:56:09 crc kubenswrapper[4704]: I1125 15:56:09.645348 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9-kube-api-access-5qr9p" (OuterVolumeSpecName: "kube-api-access-5qr9p") pod "9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9" (UID: "9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9"). InnerVolumeSpecName "kube-api-access-5qr9p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:56:09 crc kubenswrapper[4704]: I1125 15:56:09.645491 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae716d7-1612-4ba4-8860-733eca08736b-kube-api-access-7n6k8" (OuterVolumeSpecName: "kube-api-access-7n6k8") pod "3ae716d7-1612-4ba4-8860-733eca08736b" (UID: "3ae716d7-1612-4ba4-8860-733eca08736b"). InnerVolumeSpecName "kube-api-access-7n6k8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:56:09 crc kubenswrapper[4704]: I1125 15:56:09.736949 4704 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 15:56:09 crc kubenswrapper[4704]: I1125 15:56:09.737014 4704 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ae716d7-1612-4ba4-8860-733eca08736b-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 15:56:09 crc kubenswrapper[4704]: I1125 15:56:09.737024 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n6k8\" (UniqueName: \"kubernetes.io/projected/3ae716d7-1612-4ba4-8860-733eca08736b-kube-api-access-7n6k8\") on node \"crc\" DevicePath \"\""
Nov 25 15:56:09 crc kubenswrapper[4704]: I1125 15:56:09.737036 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qr9p\" (UniqueName: \"kubernetes.io/projected/9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9-kube-api-access-5qr9p\") on node \"crc\" DevicePath \"\""
Nov 25 15:56:10 crc kubenswrapper[4704]: I1125 15:56:10.225927 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-90ee-account-create-update-mcgs5"
Nov 25 15:56:10 crc kubenswrapper[4704]: I1125 15:56:10.225928 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-90ee-account-create-update-mcgs5" event={"ID":"3ae716d7-1612-4ba4-8860-733eca08736b","Type":"ContainerDied","Data":"04636db5fd6c3ade35c5b47d4749588cf479cd8c765f83e762f1f20f98301f12"}
Nov 25 15:56:10 crc kubenswrapper[4704]: I1125 15:56:10.225993 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04636db5fd6c3ade35c5b47d4749588cf479cd8c765f83e762f1f20f98301f12"
Nov 25 15:56:10 crc kubenswrapper[4704]: I1125 15:56:10.228129 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-jbcwh" event={"ID":"9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9","Type":"ContainerDied","Data":"dee4d1218145033ebd0e3797727e2f3a831ac88a7305b733c1624107c1fe887f"}
Nov 25 15:56:10 crc kubenswrapper[4704]: I1125 15:56:10.228179 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dee4d1218145033ebd0e3797727e2f3a831ac88a7305b733c1624107c1fe887f"
Nov 25 15:56:10 crc kubenswrapper[4704]: I1125 15:56:10.228144 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-jbcwh"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.567039 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-mcw5g"]
Nov 25 15:56:11 crc kubenswrapper[4704]: E1125 15:56:11.567889 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae716d7-1612-4ba4-8860-733eca08736b" containerName="mariadb-account-create-update"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.567908 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae716d7-1612-4ba4-8860-733eca08736b" containerName="mariadb-account-create-update"
Nov 25 15:56:11 crc kubenswrapper[4704]: E1125 15:56:11.567922 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9" containerName="mariadb-database-create"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.567929 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9" containerName="mariadb-database-create"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.568082 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9" containerName="mariadb-database-create"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.568094 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae716d7-1612-4ba4-8860-733eca08736b" containerName="mariadb-account-create-update"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.568593 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-mcw5g"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.571169 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.571639 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-w8pqr"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.576749 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-mcw5g"]
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.661110 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-config-data\") pod \"glance-db-sync-mcw5g\" (UID: \"dcae4c02-2b13-40ea-b7d3-0dba85d8e793\") " pod="glance-kuttl-tests/glance-db-sync-mcw5g"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.661242 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-db-sync-config-data\") pod \"glance-db-sync-mcw5g\" (UID: \"dcae4c02-2b13-40ea-b7d3-0dba85d8e793\") " pod="glance-kuttl-tests/glance-db-sync-mcw5g"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.661270 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld2w9\" (UniqueName: \"kubernetes.io/projected/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-kube-api-access-ld2w9\") pod \"glance-db-sync-mcw5g\" (UID: \"dcae4c02-2b13-40ea-b7d3-0dba85d8e793\") " pod="glance-kuttl-tests/glance-db-sync-mcw5g"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.763078 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-db-sync-config-data\") pod \"glance-db-sync-mcw5g\" (UID: \"dcae4c02-2b13-40ea-b7d3-0dba85d8e793\") " pod="glance-kuttl-tests/glance-db-sync-mcw5g"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.763150 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld2w9\" (UniqueName: \"kubernetes.io/projected/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-kube-api-access-ld2w9\") pod \"glance-db-sync-mcw5g\" (UID: \"dcae4c02-2b13-40ea-b7d3-0dba85d8e793\") " pod="glance-kuttl-tests/glance-db-sync-mcw5g"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.763182 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-config-data\") pod \"glance-db-sync-mcw5g\" (UID: \"dcae4c02-2b13-40ea-b7d3-0dba85d8e793\") " pod="glance-kuttl-tests/glance-db-sync-mcw5g"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.769425 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-config-data\") pod \"glance-db-sync-mcw5g\" (UID: \"dcae4c02-2b13-40ea-b7d3-0dba85d8e793\") " pod="glance-kuttl-tests/glance-db-sync-mcw5g"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.778451 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-db-sync-config-data\") pod \"glance-db-sync-mcw5g\" (UID: \"dcae4c02-2b13-40ea-b7d3-0dba85d8e793\") " pod="glance-kuttl-tests/glance-db-sync-mcw5g"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.784474 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld2w9\" (UniqueName: \"kubernetes.io/projected/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-kube-api-access-ld2w9\") pod \"glance-db-sync-mcw5g\" (UID: \"dcae4c02-2b13-40ea-b7d3-0dba85d8e793\") " pod="glance-kuttl-tests/glance-db-sync-mcw5g"
Nov 25 15:56:11 crc kubenswrapper[4704]: I1125 15:56:11.896145 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-mcw5g"
Nov 25 15:56:12 crc kubenswrapper[4704]: I1125 15:56:12.169615 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-mcw5g"]
Nov 25 15:56:12 crc kubenswrapper[4704]: I1125 15:56:12.242332 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-mcw5g" event={"ID":"dcae4c02-2b13-40ea-b7d3-0dba85d8e793","Type":"ContainerStarted","Data":"e9fa9ed14c58d6005068574a980a2a9d33da06fcf07865d04968a8338dc16070"}
Nov 25 15:56:13 crc kubenswrapper[4704]: I1125 15:56:13.252299 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-mcw5g" event={"ID":"dcae4c02-2b13-40ea-b7d3-0dba85d8e793","Type":"ContainerStarted","Data":"53830a382248bd999649d537d07e2b1fdeacd6405b8aa625425ed45926b58384"}
Nov 25 15:56:13 crc kubenswrapper[4704]: I1125 15:56:13.270225 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-mcw5g" podStartSLOduration=2.270209121 podStartE2EDuration="2.270209121s" podCreationTimestamp="2025-11-25 15:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:56:13.267206925 +0000 UTC m=+1259.535480706" watchObservedRunningTime="2025-11-25 15:56:13.270209121 +0000 UTC m=+1259.538482902"
Nov 25 15:56:16 crc kubenswrapper[4704]: I1125 15:56:16.272486 4704 generic.go:334] "Generic (PLEG): container finished" podID="dcae4c02-2b13-40ea-b7d3-0dba85d8e793" containerID="53830a382248bd999649d537d07e2b1fdeacd6405b8aa625425ed45926b58384" exitCode=0
Nov 25 15:56:16 crc kubenswrapper[4704]: I1125 15:56:16.272602 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-mcw5g" event={"ID":"dcae4c02-2b13-40ea-b7d3-0dba85d8e793","Type":"ContainerDied","Data":"53830a382248bd999649d537d07e2b1fdeacd6405b8aa625425ed45926b58384"}
Nov 25 15:56:17 crc kubenswrapper[4704]: I1125 15:56:17.542934 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-mcw5g"
Nov 25 15:56:17 crc kubenswrapper[4704]: I1125 15:56:17.650257 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld2w9\" (UniqueName: \"kubernetes.io/projected/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-kube-api-access-ld2w9\") pod \"dcae4c02-2b13-40ea-b7d3-0dba85d8e793\" (UID: \"dcae4c02-2b13-40ea-b7d3-0dba85d8e793\") "
Nov 25 15:56:17 crc kubenswrapper[4704]: I1125 15:56:17.650331 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-config-data\") pod \"dcae4c02-2b13-40ea-b7d3-0dba85d8e793\" (UID: \"dcae4c02-2b13-40ea-b7d3-0dba85d8e793\") "
Nov 25 15:56:17 crc kubenswrapper[4704]: I1125 15:56:17.650384 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-db-sync-config-data\") pod \"dcae4c02-2b13-40ea-b7d3-0dba85d8e793\" (UID: \"dcae4c02-2b13-40ea-b7d3-0dba85d8e793\") "
Nov 25 15:56:17 crc kubenswrapper[4704]: I1125 15:56:17.656542 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dcae4c02-2b13-40ea-b7d3-0dba85d8e793" (UID: "dcae4c02-2b13-40ea-b7d3-0dba85d8e793"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:56:17 crc kubenswrapper[4704]: I1125 15:56:17.657230 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-kube-api-access-ld2w9" (OuterVolumeSpecName: "kube-api-access-ld2w9") pod "dcae4c02-2b13-40ea-b7d3-0dba85d8e793" (UID: "dcae4c02-2b13-40ea-b7d3-0dba85d8e793"). InnerVolumeSpecName "kube-api-access-ld2w9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 15:56:17 crc kubenswrapper[4704]: I1125 15:56:17.683902 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-config-data" (OuterVolumeSpecName: "config-data") pod "dcae4c02-2b13-40ea-b7d3-0dba85d8e793" (UID: "dcae4c02-2b13-40ea-b7d3-0dba85d8e793"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 15:56:17 crc kubenswrapper[4704]: I1125 15:56:17.753450 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld2w9\" (UniqueName: \"kubernetes.io/projected/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-kube-api-access-ld2w9\") on node \"crc\" DevicePath \"\""
Nov 25 15:56:17 crc kubenswrapper[4704]: I1125 15:56:17.753496 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 15:56:17 crc kubenswrapper[4704]: I1125 15:56:17.753505 4704 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dcae4c02-2b13-40ea-b7d3-0dba85d8e793-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 15:56:18 crc kubenswrapper[4704]: I1125 15:56:18.290633 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-mcw5g" event={"ID":"dcae4c02-2b13-40ea-b7d3-0dba85d8e793","Type":"ContainerDied","Data":"e9fa9ed14c58d6005068574a980a2a9d33da06fcf07865d04968a8338dc16070"}
Nov 25 15:56:18 crc kubenswrapper[4704]: I1125 15:56:18.290680 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9fa9ed14c58d6005068574a980a2a9d33da06fcf07865d04968a8338dc16070"
Nov 25 15:56:18 crc kubenswrapper[4704]: I1125 15:56:18.290738 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-mcw5g"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.362256 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Nov 25 15:56:19 crc kubenswrapper[4704]: E1125 15:56:19.363672 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcae4c02-2b13-40ea-b7d3-0dba85d8e793" containerName="glance-db-sync"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.363755 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcae4c02-2b13-40ea-b7d3-0dba85d8e793" containerName="glance-db-sync"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.364009 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcae4c02-2b13-40ea-b7d3-0dba85d8e793" containerName="glance-db-sync"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.365275 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.367195 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.370376 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-w8pqr"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.371863 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.393456 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"]
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.476384 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.476443 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.476463 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.476494 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04d98209-2337-49da-a0ac-1f12810f5fb3-config-data\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.476526 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.476544 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-dev\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.476556 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04d98209-2337-49da-a0ac-1f12810f5fb3-logs\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.476581 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-run\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.476597 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fshqz\" (UniqueName: \"kubernetes.io/projected/04d98209-2337-49da-a0ac-1f12810f5fb3-kube-api-access-fshqz\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.476616 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.476686 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-sys\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.476702 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04d98209-2337-49da-a0ac-1f12810f5fb3-scripts\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.476722 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04d98209-2337-49da-a0ac-1f12810f5fb3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.476740 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.578403 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-sys\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.578544 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04d98209-2337-49da-a0ac-1f12810f5fb3-scripts\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.578568 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04d98209-2337-49da-a0ac-1f12810f5fb3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.578488 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-sys\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.579276 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04d98209-2337-49da-a0ac-1f12810f5fb3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.579597 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.579770 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.579823 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.579850 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.579887 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04d98209-2337-49da-a0ac-1f12810f5fb3-config-data\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.579903 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.579894 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.579928 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.579990 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-dev\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.580009 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04d98209-2337-49da-a0ac-1f12810f5fb3-logs\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.580104 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-run\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.580130 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fshqz\" (UniqueName: \"kubernetes.io/projected/04d98209-2337-49da-a0ac-1f12810f5fb3-kube-api-access-fshqz\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.580168 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.580365 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.580396 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-run\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.580580 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-dev\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.580720 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.580725 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04d98209-2337-49da-a0ac-1f12810f5fb3-logs\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.580731 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.580894 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.585892 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04d98209-2337-49da-a0ac-1f12810f5fb3-scripts\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.601943 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fshqz\" (UniqueName: \"kubernetes.io/projected/04d98209-2337-49da-a0ac-1f12810f5fb3-kube-api-access-fshqz\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.613064 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.618361 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04d98209-2337-49da-a0ac-1f12810f5fb3-config-data\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.638549 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.682989 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.687281 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.689141 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.696800 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.720375 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"]
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.782652 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.782719 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c661889-976d-44fc-a281-7c8a906f1b52-logs\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.782746 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-run\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.782772 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c661889-976d-44fc-a281-7c8a906f1b52-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.782845 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c661889-976d-44fc-a281-7c8a906f1b52-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.782882 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-dev\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0"
Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.782934 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID:
\"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.783003 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c661889-976d-44fc-a281-7c8a906f1b52-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.783068 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.783088 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.783204 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-sys\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.783241 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-etc-iscsi\") pod 
\"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.783261 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.783295 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdrzq\" (UniqueName: \"kubernetes.io/projected/1c661889-976d-44fc-a281-7c8a906f1b52-kube-api-access-vdrzq\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.885206 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.885258 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.885322 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-sys\") pod 
\"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.885347 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.885372 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.885398 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdrzq\" (UniqueName: \"kubernetes.io/projected/1c661889-976d-44fc-a281-7c8a906f1b52-kube-api-access-vdrzq\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.885426 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.885446 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c661889-976d-44fc-a281-7c8a906f1b52-logs\") pod \"glance-default-internal-api-0\" 
(UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.885461 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-run\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.885485 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c661889-976d-44fc-a281-7c8a906f1b52-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.885510 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c661889-976d-44fc-a281-7c8a906f1b52-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.885538 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.885553 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-dev\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.885568 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c661889-976d-44fc-a281-7c8a906f1b52-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.885681 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.885815 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.886187 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.886488 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") device mount path \"/mnt/openstack/pv11\"" 
pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.887954 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.888046 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.888068 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-run\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.888099 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-sys\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.888127 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-dev\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.888986 4704 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c661889-976d-44fc-a281-7c8a906f1b52-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.889241 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c661889-976d-44fc-a281-7c8a906f1b52-logs\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.893498 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c661889-976d-44fc-a281-7c8a906f1b52-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.896911 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c661889-976d-44fc-a281-7c8a906f1b52-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.906361 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdrzq\" (UniqueName: \"kubernetes.io/projected/1c661889-976d-44fc-a281-7c8a906f1b52-kube-api-access-vdrzq\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.911395 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:19 crc kubenswrapper[4704]: I1125 15:56:19.915874 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:20 crc kubenswrapper[4704]: I1125 15:56:20.068854 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:20 crc kubenswrapper[4704]: I1125 15:56:20.168881 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 25 15:56:20 crc kubenswrapper[4704]: I1125 15:56:20.316551 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"04d98209-2337-49da-a0ac-1f12810f5fb3","Type":"ContainerStarted","Data":"24b0f1a6b39f617b8a84cee86f3a8cd8277b492d6a08e87116a959e392a25d6a"} Nov 25 15:56:20 crc kubenswrapper[4704]: I1125 15:56:20.399698 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 25 15:56:20 crc kubenswrapper[4704]: W1125 15:56:20.422642 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c661889_976d_44fc_a281_7c8a906f1b52.slice/crio-0ae6ba0519c34586070ea69b80b93e3c4917c70b480b02045b71664f8bc4cc1e WatchSource:0}: Error finding container 0ae6ba0519c34586070ea69b80b93e3c4917c70b480b02045b71664f8bc4cc1e: Status 404 returned error can't find the container with id 
0ae6ba0519c34586070ea69b80b93e3c4917c70b480b02045b71664f8bc4cc1e Nov 25 15:56:20 crc kubenswrapper[4704]: I1125 15:56:20.431021 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.324310 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c661889-976d-44fc-a281-7c8a906f1b52","Type":"ContainerStarted","Data":"483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a"} Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.325199 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c661889-976d-44fc-a281-7c8a906f1b52","Type":"ContainerStarted","Data":"aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624"} Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.325214 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c661889-976d-44fc-a281-7c8a906f1b52","Type":"ContainerStarted","Data":"f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17"} Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.324418 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="1c661889-976d-44fc-a281-7c8a906f1b52" containerName="glance-log" containerID="cri-o://f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17" gracePeriod=30 Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.325224 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c661889-976d-44fc-a281-7c8a906f1b52","Type":"ContainerStarted","Data":"0ae6ba0519c34586070ea69b80b93e3c4917c70b480b02045b71664f8bc4cc1e"} Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.324479 4704 
kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="1c661889-976d-44fc-a281-7c8a906f1b52" containerName="glance-httpd" containerID="cri-o://aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624" gracePeriod=30 Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.324460 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="1c661889-976d-44fc-a281-7c8a906f1b52" containerName="glance-api" containerID="cri-o://483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a" gracePeriod=30 Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.327393 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"04d98209-2337-49da-a0ac-1f12810f5fb3","Type":"ContainerStarted","Data":"23d64f60f78a194e9f0b84c9ff9aabfe6cdaf860a0c77e90cfbd93a1690077e8"} Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.327426 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"04d98209-2337-49da-a0ac-1f12810f5fb3","Type":"ContainerStarted","Data":"afd5aae978df19dba491f861e9b57fa98b42f5fb2cec66340ac2741bc94a4b7b"} Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.327437 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"04d98209-2337-49da-a0ac-1f12810f5fb3","Type":"ContainerStarted","Data":"980a3e14be2bfb287117cc886bb023ebabe2cf0486b5beb817984b29a900b78e"} Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.359528 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.359512427 podStartE2EDuration="3.359512427s" podCreationTimestamp="2025-11-25 15:56:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:56:21.352206977 +0000 UTC m=+1267.620480778" watchObservedRunningTime="2025-11-25 15:56:21.359512427 +0000 UTC m=+1267.627786208" Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.386376 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.38634606 podStartE2EDuration="2.38634606s" podCreationTimestamp="2025-11-25 15:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:56:21.380676727 +0000 UTC m=+1267.648950508" watchObservedRunningTime="2025-11-25 15:56:21.38634606 +0000 UTC m=+1267.654619841" Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.768172 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.915489 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdrzq\" (UniqueName: \"kubernetes.io/projected/1c661889-976d-44fc-a281-7c8a906f1b52-kube-api-access-vdrzq\") pod \"1c661889-976d-44fc-a281-7c8a906f1b52\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.916757 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-etc-iscsi\") pod \"1c661889-976d-44fc-a281-7c8a906f1b52\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.916812 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"1c661889-976d-44fc-a281-7c8a906f1b52\" (UID: 
\"1c661889-976d-44fc-a281-7c8a906f1b52\") " Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.916874 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-sys\") pod \"1c661889-976d-44fc-a281-7c8a906f1b52\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.916897 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"1c661889-976d-44fc-a281-7c8a906f1b52\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.916918 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-etc-nvme\") pod \"1c661889-976d-44fc-a281-7c8a906f1b52\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.916959 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c661889-976d-44fc-a281-7c8a906f1b52-logs\") pod \"1c661889-976d-44fc-a281-7c8a906f1b52\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.916982 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c661889-976d-44fc-a281-7c8a906f1b52-config-data\") pod \"1c661889-976d-44fc-a281-7c8a906f1b52\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.917049 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-lib-modules\") pod 
\"1c661889-976d-44fc-a281-7c8a906f1b52\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.917098 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c661889-976d-44fc-a281-7c8a906f1b52-scripts\") pod \"1c661889-976d-44fc-a281-7c8a906f1b52\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.917131 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c661889-976d-44fc-a281-7c8a906f1b52-httpd-run\") pod \"1c661889-976d-44fc-a281-7c8a906f1b52\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.917194 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-run\") pod \"1c661889-976d-44fc-a281-7c8a906f1b52\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.917218 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-dev\") pod \"1c661889-976d-44fc-a281-7c8a906f1b52\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.917250 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-var-locks-brick\") pod \"1c661889-976d-44fc-a281-7c8a906f1b52\" (UID: \"1c661889-976d-44fc-a281-7c8a906f1b52\") " Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.917656 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "1c661889-976d-44fc-a281-7c8a906f1b52" (UID: "1c661889-976d-44fc-a281-7c8a906f1b52"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.917697 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "1c661889-976d-44fc-a281-7c8a906f1b52" (UID: "1c661889-976d-44fc-a281-7c8a906f1b52"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.918291 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-run" (OuterVolumeSpecName: "run") pod "1c661889-976d-44fc-a281-7c8a906f1b52" (UID: "1c661889-976d-44fc-a281-7c8a906f1b52"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.918334 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-sys" (OuterVolumeSpecName: "sys") pod "1c661889-976d-44fc-a281-7c8a906f1b52" (UID: "1c661889-976d-44fc-a281-7c8a906f1b52"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.918379 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c661889-976d-44fc-a281-7c8a906f1b52-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1c661889-976d-44fc-a281-7c8a906f1b52" (UID: "1c661889-976d-44fc-a281-7c8a906f1b52"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.918443 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "1c661889-976d-44fc-a281-7c8a906f1b52" (UID: "1c661889-976d-44fc-a281-7c8a906f1b52"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.918480 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-dev" (OuterVolumeSpecName: "dev") pod "1c661889-976d-44fc-a281-7c8a906f1b52" (UID: "1c661889-976d-44fc-a281-7c8a906f1b52"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.918528 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "1c661889-976d-44fc-a281-7c8a906f1b52" (UID: "1c661889-976d-44fc-a281-7c8a906f1b52"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.918578 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c661889-976d-44fc-a281-7c8a906f1b52-logs" (OuterVolumeSpecName: "logs") pod "1c661889-976d-44fc-a281-7c8a906f1b52" (UID: "1c661889-976d-44fc-a281-7c8a906f1b52"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.925091 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance-cache") pod "1c661889-976d-44fc-a281-7c8a906f1b52" (UID: "1c661889-976d-44fc-a281-7c8a906f1b52"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.925219 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "1c661889-976d-44fc-a281-7c8a906f1b52" (UID: "1c661889-976d-44fc-a281-7c8a906f1b52"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.925664 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c661889-976d-44fc-a281-7c8a906f1b52-kube-api-access-vdrzq" (OuterVolumeSpecName: "kube-api-access-vdrzq") pod "1c661889-976d-44fc-a281-7c8a906f1b52" (UID: "1c661889-976d-44fc-a281-7c8a906f1b52"). InnerVolumeSpecName "kube-api-access-vdrzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:56:21 crc kubenswrapper[4704]: I1125 15:56:21.925992 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c661889-976d-44fc-a281-7c8a906f1b52-scripts" (OuterVolumeSpecName: "scripts") pod "1c661889-976d-44fc-a281-7c8a906f1b52" (UID: "1c661889-976d-44fc-a281-7c8a906f1b52"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.006979 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c661889-976d-44fc-a281-7c8a906f1b52-config-data" (OuterVolumeSpecName: "config-data") pod "1c661889-976d-44fc-a281-7c8a906f1b52" (UID: "1c661889-976d-44fc-a281-7c8a906f1b52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.018549 4704 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-run\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.018582 4704 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-dev\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.018593 4704 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.018603 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdrzq\" (UniqueName: \"kubernetes.io/projected/1c661889-976d-44fc-a281-7c8a906f1b52-kube-api-access-vdrzq\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.018612 4704 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.018643 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.018654 4704 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-sys\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.018667 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.018677 4704 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.018686 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c661889-976d-44fc-a281-7c8a906f1b52-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.018694 4704 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c661889-976d-44fc-a281-7c8a906f1b52-logs\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.018703 4704 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c661889-976d-44fc-a281-7c8a906f1b52-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.018711 4704 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c661889-976d-44fc-a281-7c8a906f1b52-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.018719 4704 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/1c661889-976d-44fc-a281-7c8a906f1b52-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.033809 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.036634 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.120380 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.120653 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.335835 4704 generic.go:334] "Generic (PLEG): container finished" podID="1c661889-976d-44fc-a281-7c8a906f1b52" containerID="483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a" exitCode=143 Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.336335 4704 generic.go:334] "Generic (PLEG): container finished" podID="1c661889-976d-44fc-a281-7c8a906f1b52" containerID="aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624" exitCode=143 Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.336348 4704 generic.go:334] "Generic (PLEG): container finished" podID="1c661889-976d-44fc-a281-7c8a906f1b52" containerID="f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17" exitCode=143 Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.335941 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c661889-976d-44fc-a281-7c8a906f1b52","Type":"ContainerDied","Data":"483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a"} Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.335958 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.336408 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c661889-976d-44fc-a281-7c8a906f1b52","Type":"ContainerDied","Data":"aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624"} Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.336438 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c661889-976d-44fc-a281-7c8a906f1b52","Type":"ContainerDied","Data":"f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17"} Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.336450 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"1c661889-976d-44fc-a281-7c8a906f1b52","Type":"ContainerDied","Data":"0ae6ba0519c34586070ea69b80b93e3c4917c70b480b02045b71664f8bc4cc1e"} Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.336465 4704 scope.go:117] "RemoveContainer" containerID="483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.366114 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.377106 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.378543 4704 scope.go:117] "RemoveContainer" 
containerID="aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.397946 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 25 15:56:22 crc kubenswrapper[4704]: E1125 15:56:22.398243 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c661889-976d-44fc-a281-7c8a906f1b52" containerName="glance-log" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.398258 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c661889-976d-44fc-a281-7c8a906f1b52" containerName="glance-log" Nov 25 15:56:22 crc kubenswrapper[4704]: E1125 15:56:22.398279 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c661889-976d-44fc-a281-7c8a906f1b52" containerName="glance-api" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.398286 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c661889-976d-44fc-a281-7c8a906f1b52" containerName="glance-api" Nov 25 15:56:22 crc kubenswrapper[4704]: E1125 15:56:22.398299 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c661889-976d-44fc-a281-7c8a906f1b52" containerName="glance-httpd" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.398304 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c661889-976d-44fc-a281-7c8a906f1b52" containerName="glance-httpd" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.398429 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c661889-976d-44fc-a281-7c8a906f1b52" containerName="glance-log" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.398447 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c661889-976d-44fc-a281-7c8a906f1b52" containerName="glance-httpd" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.398459 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c661889-976d-44fc-a281-7c8a906f1b52" 
containerName="glance-api" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.399405 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.404676 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.405856 4704 scope.go:117] "RemoveContainer" containerID="f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.410396 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.425102 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c661889-976d-44fc-a281-7c8a906f1b52" path="/var/lib/kubelet/pods/1c661889-976d-44fc-a281-7c8a906f1b52/volumes" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.441043 4704 scope.go:117] "RemoveContainer" containerID="483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a" Nov 25 15:56:22 crc kubenswrapper[4704]: E1125 15:56:22.441897 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a\": container with ID starting with 483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a not found: ID does not exist" containerID="483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.441956 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a"} err="failed to get container status \"483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a\": rpc 
error: code = NotFound desc = could not find container \"483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a\": container with ID starting with 483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a not found: ID does not exist" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.441991 4704 scope.go:117] "RemoveContainer" containerID="aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624" Nov 25 15:56:22 crc kubenswrapper[4704]: E1125 15:56:22.443242 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624\": container with ID starting with aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624 not found: ID does not exist" containerID="aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.443306 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624"} err="failed to get container status \"aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624\": rpc error: code = NotFound desc = could not find container \"aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624\": container with ID starting with aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624 not found: ID does not exist" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.443344 4704 scope.go:117] "RemoveContainer" containerID="f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17" Nov 25 15:56:22 crc kubenswrapper[4704]: E1125 15:56:22.443668 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17\": container with ID starting with 
f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17 not found: ID does not exist" containerID="f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.443761 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17"} err="failed to get container status \"f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17\": rpc error: code = NotFound desc = could not find container \"f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17\": container with ID starting with f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17 not found: ID does not exist" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.443850 4704 scope.go:117] "RemoveContainer" containerID="483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.445665 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a"} err="failed to get container status \"483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a\": rpc error: code = NotFound desc = could not find container \"483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a\": container with ID starting with 483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a not found: ID does not exist" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.445711 4704 scope.go:117] "RemoveContainer" containerID="aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.446184 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624"} err="failed to get container status 
\"aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624\": rpc error: code = NotFound desc = could not find container \"aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624\": container with ID starting with aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624 not found: ID does not exist" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.446221 4704 scope.go:117] "RemoveContainer" containerID="f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.446943 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17"} err="failed to get container status \"f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17\": rpc error: code = NotFound desc = could not find container \"f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17\": container with ID starting with f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17 not found: ID does not exist" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.446965 4704 scope.go:117] "RemoveContainer" containerID="483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.447194 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a"} err="failed to get container status \"483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a\": rpc error: code = NotFound desc = could not find container \"483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a\": container with ID starting with 483719b9ace1803adc7a5e3528a03d56df1e85cc5b75d19210011104229e688a not found: ID does not exist" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.447235 4704 scope.go:117] "RemoveContainer" 
containerID="aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.447487 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624"} err="failed to get container status \"aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624\": rpc error: code = NotFound desc = could not find container \"aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624\": container with ID starting with aa0724a5b725e8ed7b61c040b1b4fb20be009faaaae872623e7d259531807624 not found: ID does not exist" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.447506 4704 scope.go:117] "RemoveContainer" containerID="f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.447721 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17"} err="failed to get container status \"f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17\": rpc error: code = NotFound desc = could not find container \"f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17\": container with ID starting with f4787851b281f0e70f9720034c47414384d9b581c066361b10610cbc2b95da17 not found: ID does not exist" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.525567 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-dev\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.525639 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4e3b666-6607-432c-9274-a75ba8716911-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.525710 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-sys\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.526587 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e3b666-6607-432c-9274-a75ba8716911-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.526630 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.526679 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4e3b666-6607-432c-9274-a75ba8716911-logs\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.526703 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.526721 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-run\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.526746 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.526767 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4e3b666-6607-432c-9274-a75ba8716911-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.526809 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.526846 4704 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dccfr\" (UniqueName: \"kubernetes.io/projected/d4e3b666-6607-432c-9274-a75ba8716911-kube-api-access-dccfr\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.526872 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.526891 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.627753 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-sys\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.627830 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e3b666-6607-432c-9274-a75ba8716911-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.627860 4704 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.627874 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-sys\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.627888 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.627903 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4e3b666-6607-432c-9274-a75ba8716911-logs\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.627919 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-run\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.627939 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.627960 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4e3b666-6607-432c-9274-a75ba8716911-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.627978 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.628010 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.627980 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.628044 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dccfr\" (UniqueName: 
\"kubernetes.io/projected/d4e3b666-6607-432c-9274-a75ba8716911-kube-api-access-dccfr\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.628060 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.628078 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.628114 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-dev\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.628144 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4e3b666-6607-432c-9274-a75ba8716911-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.628150 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.628184 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.628490 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.628496 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-dev\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.628516 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4e3b666-6607-432c-9274-a75ba8716911-logs\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.628558 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.629160 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4e3b666-6607-432c-9274-a75ba8716911-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.629223 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-run\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.635531 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4e3b666-6607-432c-9274-a75ba8716911-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.635713 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e3b666-6607-432c-9274-a75ba8716911-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.650856 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dccfr\" (UniqueName: \"kubernetes.io/projected/d4e3b666-6607-432c-9274-a75ba8716911-kube-api-access-dccfr\") pod \"glance-default-internal-api-0\" (UID: 
\"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.652552 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.654423 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:22 crc kubenswrapper[4704]: I1125 15:56:22.723070 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:23 crc kubenswrapper[4704]: I1125 15:56:23.149567 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 25 15:56:23 crc kubenswrapper[4704]: I1125 15:56:23.355418 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"d4e3b666-6607-432c-9274-a75ba8716911","Type":"ContainerStarted","Data":"f87011cb44ef6ac465628839c9297b4a26dfe8381366f2b0d1bcbac539cc8299"} Nov 25 15:56:23 crc kubenswrapper[4704]: I1125 15:56:23.355964 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"d4e3b666-6607-432c-9274-a75ba8716911","Type":"ContainerStarted","Data":"27c0de1a36d2a8d290d00de42aa3fc8959ffa876c9c4181c69898b0da75c1d52"} Nov 25 15:56:24 crc kubenswrapper[4704]: I1125 15:56:24.365896 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"d4e3b666-6607-432c-9274-a75ba8716911","Type":"ContainerStarted","Data":"338f08b9726471fdea676b2cb113909d8cd04f6e543de7b22ccb05d0a0ee1f1f"} Nov 25 15:56:24 crc kubenswrapper[4704]: I1125 15:56:24.366727 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"d4e3b666-6607-432c-9274-a75ba8716911","Type":"ContainerStarted","Data":"82cf845e3515098866a1241ef9bceeea60278a66c29144abeda36839f973d258"} Nov 25 15:56:29 crc kubenswrapper[4704]: I1125 15:56:29.683276 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 25 15:56:29 crc kubenswrapper[4704]: I1125 15:56:29.683874 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 25 15:56:29 crc kubenswrapper[4704]: I1125 15:56:29.683889 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 25 15:56:29 crc kubenswrapper[4704]: I1125 15:56:29.708950 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 25 15:56:29 crc kubenswrapper[4704]: I1125 15:56:29.709565 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 25 15:56:29 crc kubenswrapper[4704]: I1125 15:56:29.722343 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 25 15:56:29 crc kubenswrapper[4704]: I1125 15:56:29.738227 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=7.738202186 podStartE2EDuration="7.738202186s" podCreationTimestamp="2025-11-25 15:56:22 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:56:24.404675073 +0000 UTC m=+1270.672948854" watchObservedRunningTime="2025-11-25 15:56:29.738202186 +0000 UTC m=+1276.006475967" Nov 25 15:56:30 crc kubenswrapper[4704]: I1125 15:56:30.407893 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 25 15:56:30 crc kubenswrapper[4704]: I1125 15:56:30.408372 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 25 15:56:30 crc kubenswrapper[4704]: I1125 15:56:30.408383 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 25 15:56:30 crc kubenswrapper[4704]: I1125 15:56:30.425446 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 25 15:56:30 crc kubenswrapper[4704]: I1125 15:56:30.425498 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 25 15:56:30 crc kubenswrapper[4704]: I1125 15:56:30.425517 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 25 15:56:32 crc kubenswrapper[4704]: I1125 15:56:32.724862 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:32 crc kubenswrapper[4704]: I1125 15:56:32.726585 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:32 crc kubenswrapper[4704]: I1125 15:56:32.726741 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 
15:56:32 crc kubenswrapper[4704]: I1125 15:56:32.752912 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:32 crc kubenswrapper[4704]: I1125 15:56:32.753561 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:32 crc kubenswrapper[4704]: I1125 15:56:32.769009 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:33 crc kubenswrapper[4704]: I1125 15:56:33.429844 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:33 crc kubenswrapper[4704]: I1125 15:56:33.429914 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:33 crc kubenswrapper[4704]: I1125 15:56:33.429926 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:33 crc kubenswrapper[4704]: I1125 15:56:33.443333 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:33 crc kubenswrapper[4704]: I1125 15:56:33.444199 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:33 crc kubenswrapper[4704]: I1125 15:56:33.448244 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.367461 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.370285 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.374605 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.376341 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.388171 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.399113 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.416444 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-sys\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.416491 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.416515 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " 
pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.416539 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-scripts\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.416557 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.416587 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.416609 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.416631 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-config-data\") pod \"glance-default-external-api-2\" (UID: 
\"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.416658 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.416673 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-logs\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.416694 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.416744 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-dev\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.416825 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-run\") pod \"glance-default-external-api-2\" (UID: 
\"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.416904 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2rtr\" (UniqueName: \"kubernetes.io/projected/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-kube-api-access-z2rtr\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.518606 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f06623-1b90-4e01-bf7e-2ec9a076de51-config-data\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.518657 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-run\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.518682 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-sys\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.518700 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-httpd-run\") pod \"glance-default-external-api-2\" (UID: 
\"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.518734 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-run\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.518764 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.518831 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-sys\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.518844 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.519282 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 
15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.519657 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-scripts\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.519716 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.519759 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-config-data\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.519803 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.519840 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.519871 4704 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2th4\" (UniqueName: \"kubernetes.io/projected/50f06623-1b90-4e01-bf7e-2ec9a076de51-kube-api-access-t2th4\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.519900 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-dev\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.519922 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.519948 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-dev\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.519971 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-run\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 
15:56:35.519999 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2rtr\" (UniqueName: \"kubernetes.io/projected/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-kube-api-access-z2rtr\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.520030 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-sys\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.520065 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50f06623-1b90-4e01-bf7e-2ec9a076de51-scripts\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.520083 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-dev\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.520135 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50f06623-1b90-4e01-bf7e-2ec9a076de51-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 
15:56:35.520153 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50f06623-1b90-4e01-bf7e-2ec9a076de51-logs\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.520173 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.520190 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.520207 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.520225 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc 
kubenswrapper[4704]: I1125 15:56:35.520247 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.520265 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-logs\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.520273 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.520299 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.520300 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 
15:56:35.520325 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.520337 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.520350 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.520596 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-logs\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.520670 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 
15:56:35.528108 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-scripts\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.534582 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-config-data\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.541726 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2rtr\" (UniqueName: \"kubernetes.io/projected/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-kube-api-access-z2rtr\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.550102 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.553161 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-2\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622448 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-sys\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622503 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50f06623-1b90-4e01-bf7e-2ec9a076de51-scripts\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622550 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50f06623-1b90-4e01-bf7e-2ec9a076de51-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622567 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50f06623-1b90-4e01-bf7e-2ec9a076de51-logs\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622586 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622603 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622613 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-sys\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622663 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622684 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622695 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622703 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f06623-1b90-4e01-bf7e-2ec9a076de51-config-data\") 
pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622725 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622769 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622820 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2th4\" (UniqueName: \"kubernetes.io/projected/50f06623-1b90-4e01-bf7e-2ec9a076de51-kube-api-access-t2th4\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622840 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-dev\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622858 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-etc-nvme\") pod \"glance-default-external-api-1\" (UID: 
\"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622876 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-run\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.622967 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-run\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.623090 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.623296 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.623328 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50f06623-1b90-4e01-bf7e-2ec9a076de51-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.623376 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-dev\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.623473 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.623516 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.623644 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50f06623-1b90-4e01-bf7e-2ec9a076de51-logs\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.629115 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50f06623-1b90-4e01-bf7e-2ec9a076de51-scripts\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc 
kubenswrapper[4704]: I1125 15:56:35.631778 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f06623-1b90-4e01-bf7e-2ec9a076de51-config-data\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.640710 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2th4\" (UniqueName: \"kubernetes.io/projected/50f06623-1b90-4e01-bf7e-2ec9a076de51-kube-api-access-t2th4\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.648895 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.651921 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.695378 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:35 crc kubenswrapper[4704]: I1125 15:56:35.714635 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:36 crc kubenswrapper[4704]: I1125 15:56:36.144127 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Nov 25 15:56:36 crc kubenswrapper[4704]: W1125 15:56:36.147277 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5740db0f_d924_4c99_a5f8_a5b6fcf6d2d1.slice/crio-ba32e503b540df2d3d9a9c975cebace1d563128b7133d473678e22db6a2d3d5e WatchSource:0}: Error finding container ba32e503b540df2d3d9a9c975cebace1d563128b7133d473678e22db6a2d3d5e: Status 404 returned error can't find the container with id ba32e503b540df2d3d9a9c975cebace1d563128b7133d473678e22db6a2d3d5e Nov 25 15:56:36 crc kubenswrapper[4704]: I1125 15:56:36.188901 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 25 15:56:36 crc kubenswrapper[4704]: W1125 15:56:36.190932 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50f06623_1b90_4e01_bf7e_2ec9a076de51.slice/crio-85b2ad59172337b02a0c4391ef05691b975f3c3e4b16238eb3c3b1ba8d7d1b56 WatchSource:0}: Error finding container 85b2ad59172337b02a0c4391ef05691b975f3c3e4b16238eb3c3b1ba8d7d1b56: Status 404 returned error can't find the container with id 85b2ad59172337b02a0c4391ef05691b975f3c3e4b16238eb3c3b1ba8d7d1b56 Nov 25 15:56:36 crc kubenswrapper[4704]: I1125 15:56:36.455980 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1","Type":"ContainerStarted","Data":"24e50b7918df0089131f79d0e9d91c4fba43f92bb57ce19fb680b2083de2f86f"} Nov 25 15:56:36 crc kubenswrapper[4704]: I1125 15:56:36.456036 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1","Type":"ContainerStarted","Data":"4db0677b72edbbc5a0a2e79a23bb39f11137ef6493ce1c831c76760a7a95ba6e"} Nov 25 15:56:36 crc kubenswrapper[4704]: I1125 15:56:36.456051 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1","Type":"ContainerStarted","Data":"ba32e503b540df2d3d9a9c975cebace1d563128b7133d473678e22db6a2d3d5e"} Nov 25 15:56:36 crc kubenswrapper[4704]: I1125 15:56:36.460121 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"50f06623-1b90-4e01-bf7e-2ec9a076de51","Type":"ContainerStarted","Data":"3753e50ca5b6836d1569754164ebfba39f94ed2a061ff439f703ef34e18b77a1"} Nov 25 15:56:36 crc kubenswrapper[4704]: I1125 15:56:36.460191 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"50f06623-1b90-4e01-bf7e-2ec9a076de51","Type":"ContainerStarted","Data":"d083ef3e0a40aa85473df44c192356a13c5685ae70d580460fe4f1d08c1bcbd5"} Nov 25 15:56:36 crc kubenswrapper[4704]: I1125 15:56:36.460204 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"50f06623-1b90-4e01-bf7e-2ec9a076de51","Type":"ContainerStarted","Data":"85b2ad59172337b02a0c4391ef05691b975f3c3e4b16238eb3c3b1ba8d7d1b56"} Nov 25 15:56:37 crc kubenswrapper[4704]: I1125 15:56:37.468725 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"50f06623-1b90-4e01-bf7e-2ec9a076de51","Type":"ContainerStarted","Data":"53e0f5e632e6350bc8fe272187238cdd9ad93684c0205dd237dfe1e83e4088c8"} Nov 25 15:56:37 crc kubenswrapper[4704]: I1125 15:56:37.473247 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1","Type":"ContainerStarted","Data":"e20673c20d7c0311239a889c402809af197e5e17642c1d0f8a07b0ed4b84a282"} Nov 25 15:56:37 crc kubenswrapper[4704]: I1125 15:56:37.531323 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=3.531304145 podStartE2EDuration="3.531304145s" podCreationTimestamp="2025-11-25 15:56:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:56:37.508624172 +0000 UTC m=+1283.776898143" watchObservedRunningTime="2025-11-25 15:56:37.531304145 +0000 UTC m=+1283.799577946" Nov 25 15:56:37 crc kubenswrapper[4704]: I1125 15:56:37.531821 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-2" podStartSLOduration=3.53181609 podStartE2EDuration="3.53181609s" podCreationTimestamp="2025-11-25 15:56:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 15:56:37.527900747 +0000 UTC m=+1283.796174528" watchObservedRunningTime="2025-11-25 15:56:37.53181609 +0000 UTC m=+1283.800089871" Nov 25 15:56:37 crc kubenswrapper[4704]: I1125 15:56:37.964482 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:56:37 crc kubenswrapper[4704]: I1125 15:56:37.965054 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:56:37 crc kubenswrapper[4704]: I1125 15:56:37.965106 4704 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 15:56:37 crc kubenswrapper[4704]: I1125 15:56:37.965905 4704 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed4f107353622069826562153315aa9eb23b779c9df0b35ea109bbd82177caad"} pod="openshift-machine-config-operator/machine-config-daemon-djz8x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 15:56:37 crc kubenswrapper[4704]: I1125 15:56:37.965967 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" containerID="cri-o://ed4f107353622069826562153315aa9eb23b779c9df0b35ea109bbd82177caad" gracePeriod=600 Nov 25 15:56:38 crc kubenswrapper[4704]: I1125 15:56:38.484582 4704 generic.go:334] "Generic (PLEG): container finished" podID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerID="ed4f107353622069826562153315aa9eb23b779c9df0b35ea109bbd82177caad" exitCode=0 Nov 25 15:56:38 crc kubenswrapper[4704]: I1125 15:56:38.486007 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerDied","Data":"ed4f107353622069826562153315aa9eb23b779c9df0b35ea109bbd82177caad"} Nov 25 15:56:38 crc kubenswrapper[4704]: I1125 15:56:38.486037 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" 
event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerStarted","Data":"7a574ca126e29dffea7335dc4eb45068ba5d1201355a4dba5124c9c343f20921"} Nov 25 15:56:38 crc kubenswrapper[4704]: I1125 15:56:38.486055 4704 scope.go:117] "RemoveContainer" containerID="0a8966b76dc1d40a4bda67fc26f25a19803f2f36d74b3a7ae6b45d74acb00ad9" Nov 25 15:56:45 crc kubenswrapper[4704]: I1125 15:56:45.695807 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:45 crc kubenswrapper[4704]: I1125 15:56:45.696775 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:45 crc kubenswrapper[4704]: I1125 15:56:45.696806 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:45 crc kubenswrapper[4704]: I1125 15:56:45.715965 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:45 crc kubenswrapper[4704]: I1125 15:56:45.716023 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:45 crc kubenswrapper[4704]: I1125 15:56:45.716035 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:45 crc kubenswrapper[4704]: I1125 15:56:45.723729 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:45 crc kubenswrapper[4704]: I1125 15:56:45.729418 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:45 crc kubenswrapper[4704]: I1125 15:56:45.757528 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:45 crc kubenswrapper[4704]: I1125 15:56:45.759482 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:45 crc kubenswrapper[4704]: I1125 15:56:45.759652 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:45 crc kubenswrapper[4704]: I1125 15:56:45.797415 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:46 crc kubenswrapper[4704]: I1125 15:56:46.584733 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:46 crc kubenswrapper[4704]: I1125 15:56:46.585038 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:46 crc kubenswrapper[4704]: I1125 15:56:46.585100 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:46 crc kubenswrapper[4704]: I1125 15:56:46.585203 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:46 crc kubenswrapper[4704]: I1125 15:56:46.585259 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:46 crc kubenswrapper[4704]: I1125 15:56:46.585324 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:46 crc kubenswrapper[4704]: I1125 15:56:46.596808 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:46 crc 
kubenswrapper[4704]: I1125 15:56:46.597637 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:46 crc kubenswrapper[4704]: I1125 15:56:46.599674 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:46 crc kubenswrapper[4704]: I1125 15:56:46.600945 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:56:46 crc kubenswrapper[4704]: I1125 15:56:46.601290 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:56:46 crc kubenswrapper[4704]: I1125 15:56:46.601829 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:59:07 crc kubenswrapper[4704]: I1125 15:59:07.964494 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:59:07 crc kubenswrapper[4704]: I1125 15:59:07.965348 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:59:10 crc kubenswrapper[4704]: I1125 15:59:10.763983 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xzjkk"] Nov 25 15:59:10 crc kubenswrapper[4704]: I1125 15:59:10.766284 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:10 crc kubenswrapper[4704]: I1125 15:59:10.776416 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xzjkk"] Nov 25 15:59:10 crc kubenswrapper[4704]: I1125 15:59:10.935536 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-catalog-content\") pod \"community-operators-xzjkk\" (UID: \"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8\") " pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:10 crc kubenswrapper[4704]: I1125 15:59:10.935887 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzbcm\" (UniqueName: \"kubernetes.io/projected/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-kube-api-access-hzbcm\") pod \"community-operators-xzjkk\" (UID: \"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8\") " pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:10 crc kubenswrapper[4704]: I1125 15:59:10.936024 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-utilities\") pod \"community-operators-xzjkk\" (UID: \"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8\") " pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:11 crc kubenswrapper[4704]: I1125 15:59:11.037151 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzbcm\" (UniqueName: \"kubernetes.io/projected/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-kube-api-access-hzbcm\") pod \"community-operators-xzjkk\" (UID: \"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8\") " pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:11 crc kubenswrapper[4704]: I1125 15:59:11.037216 4704 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-utilities\") pod \"community-operators-xzjkk\" (UID: \"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8\") " pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:11 crc kubenswrapper[4704]: I1125 15:59:11.037243 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-catalog-content\") pod \"community-operators-xzjkk\" (UID: \"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8\") " pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:11 crc kubenswrapper[4704]: I1125 15:59:11.037817 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-catalog-content\") pod \"community-operators-xzjkk\" (UID: \"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8\") " pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:11 crc kubenswrapper[4704]: I1125 15:59:11.038428 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-utilities\") pod \"community-operators-xzjkk\" (UID: \"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8\") " pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:11 crc kubenswrapper[4704]: I1125 15:59:11.066834 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzbcm\" (UniqueName: \"kubernetes.io/projected/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-kube-api-access-hzbcm\") pod \"community-operators-xzjkk\" (UID: \"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8\") " pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:11 crc kubenswrapper[4704]: I1125 15:59:11.128014 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:11 crc kubenswrapper[4704]: I1125 15:59:11.635646 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xzjkk"] Nov 25 15:59:11 crc kubenswrapper[4704]: I1125 15:59:11.905555 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzjkk" event={"ID":"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8","Type":"ContainerStarted","Data":"dd07ad8db612a2e3abfde431dc6ef4dafa47fbbffccd69226121e433fd0ed004"} Nov 25 15:59:11 crc kubenswrapper[4704]: I1125 15:59:11.905616 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzjkk" event={"ID":"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8","Type":"ContainerStarted","Data":"6090042302e3a5d454cc3daafb4f5344e7f76e189721d5fb536d839d2c1e3ecb"} Nov 25 15:59:12 crc kubenswrapper[4704]: I1125 15:59:12.915005 4704 generic.go:334] "Generic (PLEG): container finished" podID="4a58a0eb-db99-48d3-aa22-c9ff8fc874c8" containerID="dd07ad8db612a2e3abfde431dc6ef4dafa47fbbffccd69226121e433fd0ed004" exitCode=0 Nov 25 15:59:12 crc kubenswrapper[4704]: I1125 15:59:12.915110 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzjkk" event={"ID":"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8","Type":"ContainerDied","Data":"dd07ad8db612a2e3abfde431dc6ef4dafa47fbbffccd69226121e433fd0ed004"} Nov 25 15:59:12 crc kubenswrapper[4704]: I1125 15:59:12.917825 4704 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 15:59:14 crc kubenswrapper[4704]: I1125 15:59:14.936337 4704 generic.go:334] "Generic (PLEG): container finished" podID="4a58a0eb-db99-48d3-aa22-c9ff8fc874c8" containerID="7fd1aff5b3010fbf244f5d1d1e1e7f86d353cb1b03cd542131a11d9dbd3c7d95" exitCode=0 Nov 25 15:59:14 crc kubenswrapper[4704]: I1125 15:59:14.936425 4704 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-xzjkk" event={"ID":"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8","Type":"ContainerDied","Data":"7fd1aff5b3010fbf244f5d1d1e1e7f86d353cb1b03cd542131a11d9dbd3c7d95"} Nov 25 15:59:15 crc kubenswrapper[4704]: I1125 15:59:15.948237 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzjkk" event={"ID":"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8","Type":"ContainerStarted","Data":"bfb2e7ad3bcc7cfaa618cf727582350e4bf4d35a939c24e125c0846112348108"} Nov 25 15:59:15 crc kubenswrapper[4704]: I1125 15:59:15.972743 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xzjkk" podStartSLOduration=3.332028987 podStartE2EDuration="5.972720337s" podCreationTimestamp="2025-11-25 15:59:10 +0000 UTC" firstStartedPulling="2025-11-25 15:59:12.917552464 +0000 UTC m=+1439.185826245" lastFinishedPulling="2025-11-25 15:59:15.558243814 +0000 UTC m=+1441.826517595" observedRunningTime="2025-11-25 15:59:15.969168104 +0000 UTC m=+1442.237441885" watchObservedRunningTime="2025-11-25 15:59:15.972720337 +0000 UTC m=+1442.240994118" Nov 25 15:59:21 crc kubenswrapper[4704]: I1125 15:59:21.129023 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:21 crc kubenswrapper[4704]: I1125 15:59:21.130444 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:21 crc kubenswrapper[4704]: I1125 15:59:21.190459 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:22 crc kubenswrapper[4704]: I1125 15:59:22.027993 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:22 crc kubenswrapper[4704]: I1125 
15:59:22.075562 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xzjkk"] Nov 25 15:59:24 crc kubenswrapper[4704]: I1125 15:59:24.014226 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xzjkk" podUID="4a58a0eb-db99-48d3-aa22-c9ff8fc874c8" containerName="registry-server" containerID="cri-o://bfb2e7ad3bcc7cfaa618cf727582350e4bf4d35a939c24e125c0846112348108" gracePeriod=2 Nov 25 15:59:24 crc kubenswrapper[4704]: I1125 15:59:24.441875 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:24 crc kubenswrapper[4704]: I1125 15:59:24.551299 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-catalog-content\") pod \"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8\" (UID: \"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8\") " Nov 25 15:59:24 crc kubenswrapper[4704]: I1125 15:59:24.551562 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-utilities\") pod \"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8\" (UID: \"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8\") " Nov 25 15:59:24 crc kubenswrapper[4704]: I1125 15:59:24.551634 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzbcm\" (UniqueName: \"kubernetes.io/projected/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-kube-api-access-hzbcm\") pod \"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8\" (UID: \"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8\") " Nov 25 15:59:24 crc kubenswrapper[4704]: I1125 15:59:24.552645 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-utilities" (OuterVolumeSpecName: 
"utilities") pod "4a58a0eb-db99-48d3-aa22-c9ff8fc874c8" (UID: "4a58a0eb-db99-48d3-aa22-c9ff8fc874c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:24 crc kubenswrapper[4704]: I1125 15:59:24.554107 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:24 crc kubenswrapper[4704]: I1125 15:59:24.558342 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-kube-api-access-hzbcm" (OuterVolumeSpecName: "kube-api-access-hzbcm") pod "4a58a0eb-db99-48d3-aa22-c9ff8fc874c8" (UID: "4a58a0eb-db99-48d3-aa22-c9ff8fc874c8"). InnerVolumeSpecName "kube-api-access-hzbcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:24 crc kubenswrapper[4704]: I1125 15:59:24.598172 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a58a0eb-db99-48d3-aa22-c9ff8fc874c8" (UID: "4a58a0eb-db99-48d3-aa22-c9ff8fc874c8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:24 crc kubenswrapper[4704]: I1125 15:59:24.655118 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzbcm\" (UniqueName: \"kubernetes.io/projected/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-kube-api-access-hzbcm\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:24 crc kubenswrapper[4704]: I1125 15:59:24.655178 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:25 crc kubenswrapper[4704]: I1125 15:59:25.022600 4704 generic.go:334] "Generic (PLEG): container finished" podID="4a58a0eb-db99-48d3-aa22-c9ff8fc874c8" containerID="bfb2e7ad3bcc7cfaa618cf727582350e4bf4d35a939c24e125c0846112348108" exitCode=0 Nov 25 15:59:25 crc kubenswrapper[4704]: I1125 15:59:25.022657 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzjkk" event={"ID":"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8","Type":"ContainerDied","Data":"bfb2e7ad3bcc7cfaa618cf727582350e4bf4d35a939c24e125c0846112348108"} Nov 25 15:59:25 crc kubenswrapper[4704]: I1125 15:59:25.022681 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xzjkk" Nov 25 15:59:25 crc kubenswrapper[4704]: I1125 15:59:25.022703 4704 scope.go:117] "RemoveContainer" containerID="bfb2e7ad3bcc7cfaa618cf727582350e4bf4d35a939c24e125c0846112348108" Nov 25 15:59:25 crc kubenswrapper[4704]: I1125 15:59:25.022689 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xzjkk" event={"ID":"4a58a0eb-db99-48d3-aa22-c9ff8fc874c8","Type":"ContainerDied","Data":"6090042302e3a5d454cc3daafb4f5344e7f76e189721d5fb536d839d2c1e3ecb"} Nov 25 15:59:25 crc kubenswrapper[4704]: I1125 15:59:25.055777 4704 scope.go:117] "RemoveContainer" containerID="7fd1aff5b3010fbf244f5d1d1e1e7f86d353cb1b03cd542131a11d9dbd3c7d95" Nov 25 15:59:25 crc kubenswrapper[4704]: I1125 15:59:25.061325 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xzjkk"] Nov 25 15:59:25 crc kubenswrapper[4704]: I1125 15:59:25.068908 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xzjkk"] Nov 25 15:59:25 crc kubenswrapper[4704]: I1125 15:59:25.084899 4704 scope.go:117] "RemoveContainer" containerID="dd07ad8db612a2e3abfde431dc6ef4dafa47fbbffccd69226121e433fd0ed004" Nov 25 15:59:25 crc kubenswrapper[4704]: I1125 15:59:25.111106 4704 scope.go:117] "RemoveContainer" containerID="bfb2e7ad3bcc7cfaa618cf727582350e4bf4d35a939c24e125c0846112348108" Nov 25 15:59:25 crc kubenswrapper[4704]: E1125 15:59:25.111527 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfb2e7ad3bcc7cfaa618cf727582350e4bf4d35a939c24e125c0846112348108\": container with ID starting with bfb2e7ad3bcc7cfaa618cf727582350e4bf4d35a939c24e125c0846112348108 not found: ID does not exist" containerID="bfb2e7ad3bcc7cfaa618cf727582350e4bf4d35a939c24e125c0846112348108" Nov 25 15:59:25 crc kubenswrapper[4704]: I1125 15:59:25.111561 4704 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfb2e7ad3bcc7cfaa618cf727582350e4bf4d35a939c24e125c0846112348108"} err="failed to get container status \"bfb2e7ad3bcc7cfaa618cf727582350e4bf4d35a939c24e125c0846112348108\": rpc error: code = NotFound desc = could not find container \"bfb2e7ad3bcc7cfaa618cf727582350e4bf4d35a939c24e125c0846112348108\": container with ID starting with bfb2e7ad3bcc7cfaa618cf727582350e4bf4d35a939c24e125c0846112348108 not found: ID does not exist" Nov 25 15:59:25 crc kubenswrapper[4704]: I1125 15:59:25.111581 4704 scope.go:117] "RemoveContainer" containerID="7fd1aff5b3010fbf244f5d1d1e1e7f86d353cb1b03cd542131a11d9dbd3c7d95" Nov 25 15:59:25 crc kubenswrapper[4704]: E1125 15:59:25.111893 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fd1aff5b3010fbf244f5d1d1e1e7f86d353cb1b03cd542131a11d9dbd3c7d95\": container with ID starting with 7fd1aff5b3010fbf244f5d1d1e1e7f86d353cb1b03cd542131a11d9dbd3c7d95 not found: ID does not exist" containerID="7fd1aff5b3010fbf244f5d1d1e1e7f86d353cb1b03cd542131a11d9dbd3c7d95" Nov 25 15:59:25 crc kubenswrapper[4704]: I1125 15:59:25.111937 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fd1aff5b3010fbf244f5d1d1e1e7f86d353cb1b03cd542131a11d9dbd3c7d95"} err="failed to get container status \"7fd1aff5b3010fbf244f5d1d1e1e7f86d353cb1b03cd542131a11d9dbd3c7d95\": rpc error: code = NotFound desc = could not find container \"7fd1aff5b3010fbf244f5d1d1e1e7f86d353cb1b03cd542131a11d9dbd3c7d95\": container with ID starting with 7fd1aff5b3010fbf244f5d1d1e1e7f86d353cb1b03cd542131a11d9dbd3c7d95 not found: ID does not exist" Nov 25 15:59:25 crc kubenswrapper[4704]: I1125 15:59:25.111967 4704 scope.go:117] "RemoveContainer" containerID="dd07ad8db612a2e3abfde431dc6ef4dafa47fbbffccd69226121e433fd0ed004" Nov 25 15:59:25 crc kubenswrapper[4704]: E1125 
15:59:25.112429 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd07ad8db612a2e3abfde431dc6ef4dafa47fbbffccd69226121e433fd0ed004\": container with ID starting with dd07ad8db612a2e3abfde431dc6ef4dafa47fbbffccd69226121e433fd0ed004 not found: ID does not exist" containerID="dd07ad8db612a2e3abfde431dc6ef4dafa47fbbffccd69226121e433fd0ed004" Nov 25 15:59:25 crc kubenswrapper[4704]: I1125 15:59:25.112458 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd07ad8db612a2e3abfde431dc6ef4dafa47fbbffccd69226121e433fd0ed004"} err="failed to get container status \"dd07ad8db612a2e3abfde431dc6ef4dafa47fbbffccd69226121e433fd0ed004\": rpc error: code = NotFound desc = could not find container \"dd07ad8db612a2e3abfde431dc6ef4dafa47fbbffccd69226121e433fd0ed004\": container with ID starting with dd07ad8db612a2e3abfde431dc6ef4dafa47fbbffccd69226121e433fd0ed004 not found: ID does not exist" Nov 25 15:59:26 crc kubenswrapper[4704]: I1125 15:59:26.429512 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a58a0eb-db99-48d3-aa22-c9ff8fc874c8" path="/var/lib/kubelet/pods/4a58a0eb-db99-48d3-aa22-c9ff8fc874c8/volumes" Nov 25 15:59:37 crc kubenswrapper[4704]: I1125 15:59:37.720196 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Nov 25 15:59:37 crc kubenswrapper[4704]: I1125 15:59:37.721243 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" containerName="glance-log" containerID="cri-o://4db0677b72edbbc5a0a2e79a23bb39f11137ef6493ce1c831c76760a7a95ba6e" gracePeriod=30 Nov 25 15:59:37 crc kubenswrapper[4704]: I1125 15:59:37.721904 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" 
podUID="5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" containerName="glance-api" containerID="cri-o://e20673c20d7c0311239a889c402809af197e5e17642c1d0f8a07b0ed4b84a282" gracePeriod=30 Nov 25 15:59:37 crc kubenswrapper[4704]: I1125 15:59:37.722076 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" containerName="glance-httpd" containerID="cri-o://24e50b7918df0089131f79d0e9d91c4fba43f92bb57ce19fb680b2083de2f86f" gracePeriod=30 Nov 25 15:59:37 crc kubenswrapper[4704]: I1125 15:59:37.725117 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 25 15:59:37 crc kubenswrapper[4704]: I1125 15:59:37.725431 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="50f06623-1b90-4e01-bf7e-2ec9a076de51" containerName="glance-log" containerID="cri-o://d083ef3e0a40aa85473df44c192356a13c5685ae70d580460fe4f1d08c1bcbd5" gracePeriod=30 Nov 25 15:59:37 crc kubenswrapper[4704]: I1125 15:59:37.725555 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="50f06623-1b90-4e01-bf7e-2ec9a076de51" containerName="glance-httpd" containerID="cri-o://3753e50ca5b6836d1569754164ebfba39f94ed2a061ff439f703ef34e18b77a1" gracePeriod=30 Nov 25 15:59:37 crc kubenswrapper[4704]: I1125 15:59:37.725568 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="50f06623-1b90-4e01-bf7e-2ec9a076de51" containerName="glance-api" containerID="cri-o://53e0f5e632e6350bc8fe272187238cdd9ad93684c0205dd237dfe1e83e4088c8" gracePeriod=30 Nov 25 15:59:37 crc kubenswrapper[4704]: I1125 15:59:37.964269 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 15:59:37 crc kubenswrapper[4704]: I1125 15:59:37.964330 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.120541 4704 generic.go:334] "Generic (PLEG): container finished" podID="50f06623-1b90-4e01-bf7e-2ec9a076de51" containerID="3753e50ca5b6836d1569754164ebfba39f94ed2a061ff439f703ef34e18b77a1" exitCode=0 Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.121080 4704 generic.go:334] "Generic (PLEG): container finished" podID="50f06623-1b90-4e01-bf7e-2ec9a076de51" containerID="d083ef3e0a40aa85473df44c192356a13c5685ae70d580460fe4f1d08c1bcbd5" exitCode=143 Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.120620 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"50f06623-1b90-4e01-bf7e-2ec9a076de51","Type":"ContainerDied","Data":"3753e50ca5b6836d1569754164ebfba39f94ed2a061ff439f703ef34e18b77a1"} Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.121177 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"50f06623-1b90-4e01-bf7e-2ec9a076de51","Type":"ContainerDied","Data":"d083ef3e0a40aa85473df44c192356a13c5685ae70d580460fe4f1d08c1bcbd5"} Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.123617 4704 generic.go:334] "Generic (PLEG): container finished" podID="5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" containerID="24e50b7918df0089131f79d0e9d91c4fba43f92bb57ce19fb680b2083de2f86f" exitCode=0 
Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.123644 4704 generic.go:334] "Generic (PLEG): container finished" podID="5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" containerID="4db0677b72edbbc5a0a2e79a23bb39f11137ef6493ce1c831c76760a7a95ba6e" exitCode=143 Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.123666 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1","Type":"ContainerDied","Data":"24e50b7918df0089131f79d0e9d91c4fba43f92bb57ce19fb680b2083de2f86f"} Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.123693 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1","Type":"ContainerDied","Data":"4db0677b72edbbc5a0a2e79a23bb39f11137ef6493ce1c831c76760a7a95ba6e"} Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.632705 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.701830 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.772744 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-var-locks-brick\") pod \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.773300 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-etc-nvme\") pod \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.773321 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-run\") pod \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.772935 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" (UID: "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.773348 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" (UID: "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.773382 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-scripts\") pod \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.773438 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-run" (OuterVolumeSpecName: "run") pod "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" (UID: "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.773540 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-config-data\") pod \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.773576 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-httpd-run\") pod \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.774715 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.774735 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-dev\") pod \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.774757 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-sys\") pod \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.774801 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.774820 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-logs\") pod \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.774881 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2rtr\" (UniqueName: \"kubernetes.io/projected/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-kube-api-access-z2rtr\") pod \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.774910 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-etc-iscsi\") pod \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.774928 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-lib-modules\") pod \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\" (UID: \"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.774877 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" (UID: "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.774887 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-sys" (OuterVolumeSpecName: "sys") pod "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" (UID: "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.774908 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-dev" (OuterVolumeSpecName: "dev") pod "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" (UID: "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.775064 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" (UID: "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.775150 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" (UID: "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.775302 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-logs" (OuterVolumeSpecName: "logs") pod "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" (UID: "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.775643 4704 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.775658 4704 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.775667 4704 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.775678 4704 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 
15:59:38.775687 4704 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-run\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.775698 4704 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.775711 4704 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-dev\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.775719 4704 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-sys\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.775728 4704 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-logs\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.780900 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-kube-api-access-z2rtr" (OuterVolumeSpecName: "kube-api-access-z2rtr") pod "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" (UID: "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1"). InnerVolumeSpecName "kube-api-access-z2rtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.782608 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance-cache") pod "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" (UID: "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1"). 
InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.782608 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-scripts" (OuterVolumeSpecName: "scripts") pod "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" (UID: "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.782774 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" (UID: "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.878042 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50f06623-1b90-4e01-bf7e-2ec9a076de51-scripts\") pod \"50f06623-1b90-4e01-bf7e-2ec9a076de51\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.878116 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50f06623-1b90-4e01-bf7e-2ec9a076de51-httpd-run\") pod \"50f06623-1b90-4e01-bf7e-2ec9a076de51\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.878152 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-etc-iscsi\") pod \"50f06623-1b90-4e01-bf7e-2ec9a076de51\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " Nov 25 15:59:38 crc kubenswrapper[4704]: 
I1125 15:59:38.878203 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-etc-nvme\") pod \"50f06623-1b90-4e01-bf7e-2ec9a076de51\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.878222 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-run\") pod \"50f06623-1b90-4e01-bf7e-2ec9a076de51\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.878242 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"50f06623-1b90-4e01-bf7e-2ec9a076de51\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.878263 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50f06623-1b90-4e01-bf7e-2ec9a076de51-logs\") pod \"50f06623-1b90-4e01-bf7e-2ec9a076de51\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.878319 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f06623-1b90-4e01-bf7e-2ec9a076de51-config-data\") pod \"50f06623-1b90-4e01-bf7e-2ec9a076de51\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.878340 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-lib-modules\") pod \"50f06623-1b90-4e01-bf7e-2ec9a076de51\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " Nov 25 15:59:38 crc 
kubenswrapper[4704]: I1125 15:59:38.878361 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2th4\" (UniqueName: \"kubernetes.io/projected/50f06623-1b90-4e01-bf7e-2ec9a076de51-kube-api-access-t2th4\") pod \"50f06623-1b90-4e01-bf7e-2ec9a076de51\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.878386 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-dev\") pod \"50f06623-1b90-4e01-bf7e-2ec9a076de51\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.878423 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-var-locks-brick\") pod \"50f06623-1b90-4e01-bf7e-2ec9a076de51\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.878461 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-sys\") pod \"50f06623-1b90-4e01-bf7e-2ec9a076de51\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.878488 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"50f06623-1b90-4e01-bf7e-2ec9a076de51\" (UID: \"50f06623-1b90-4e01-bf7e-2ec9a076de51\") " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.878767 4704 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 
15:59:38.879367 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.879393 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.879405 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2rtr\" (UniqueName: \"kubernetes.io/projected/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-kube-api-access-z2rtr\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.886915 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "50f06623-1b90-4e01-bf7e-2ec9a076de51" (UID: "50f06623-1b90-4e01-bf7e-2ec9a076de51"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.887009 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "50f06623-1b90-4e01-bf7e-2ec9a076de51" (UID: "50f06623-1b90-4e01-bf7e-2ec9a076de51"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.890076 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "50f06623-1b90-4e01-bf7e-2ec9a076de51" (UID: "50f06623-1b90-4e01-bf7e-2ec9a076de51"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.890487 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50f06623-1b90-4e01-bf7e-2ec9a076de51-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "50f06623-1b90-4e01-bf7e-2ec9a076de51" (UID: "50f06623-1b90-4e01-bf7e-2ec9a076de51"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.890527 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "50f06623-1b90-4e01-bf7e-2ec9a076de51" (UID: "50f06623-1b90-4e01-bf7e-2ec9a076de51"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.898003 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f06623-1b90-4e01-bf7e-2ec9a076de51-scripts" (OuterVolumeSpecName: "scripts") pod "50f06623-1b90-4e01-bf7e-2ec9a076de51" (UID: "50f06623-1b90-4e01-bf7e-2ec9a076de51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.905628 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-sys" (OuterVolumeSpecName: "sys") pod "50f06623-1b90-4e01-bf7e-2ec9a076de51" (UID: "50f06623-1b90-4e01-bf7e-2ec9a076de51"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.906068 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50f06623-1b90-4e01-bf7e-2ec9a076de51-logs" (OuterVolumeSpecName: "logs") pod "50f06623-1b90-4e01-bf7e-2ec9a076de51" (UID: "50f06623-1b90-4e01-bf7e-2ec9a076de51"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.906105 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-dev" (OuterVolumeSpecName: "dev") pod "50f06623-1b90-4e01-bf7e-2ec9a076de51" (UID: "50f06623-1b90-4e01-bf7e-2ec9a076de51"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.906126 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-run" (OuterVolumeSpecName: "run") pod "50f06623-1b90-4e01-bf7e-2ec9a076de51" (UID: "50f06623-1b90-4e01-bf7e-2ec9a076de51"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.966801 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.967941 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "50f06623-1b90-4e01-bf7e-2ec9a076de51" (UID: "50f06623-1b90-4e01-bf7e-2ec9a076de51"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.969371 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f06623-1b90-4e01-bf7e-2ec9a076de51-kube-api-access-t2th4" (OuterVolumeSpecName: "kube-api-access-t2th4") pod "50f06623-1b90-4e01-bf7e-2ec9a076de51" (UID: "50f06623-1b90-4e01-bf7e-2ec9a076de51"). InnerVolumeSpecName "kube-api-access-t2th4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.970347 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.980888 4704 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.980927 4704 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.980938 4704 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-run\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.980948 4704 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50f06623-1b90-4e01-bf7e-2ec9a076de51-logs\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.980956 4704 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-lib-modules\") on node 
\"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.980966 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2th4\" (UniqueName: \"kubernetes.io/projected/50f06623-1b90-4e01-bf7e-2ec9a076de51-kube-api-access-t2th4\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.980975 4704 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-dev\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.980984 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.980993 4704 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.981002 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.981010 4704 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50f06623-1b90-4e01-bf7e-2ec9a076de51-sys\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.981031 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.981039 4704 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/50f06623-1b90-4e01-bf7e-2ec9a076de51-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.981047 4704 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50f06623-1b90-4e01-bf7e-2ec9a076de51-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.986008 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "50f06623-1b90-4e01-bf7e-2ec9a076de51" (UID: "50f06623-1b90-4e01-bf7e-2ec9a076de51"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 15:59:38 crc kubenswrapper[4704]: I1125 15:59:38.997574 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-config-data" (OuterVolumeSpecName: "config-data") pod "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" (UID: "5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.006730 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.074674 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f06623-1b90-4e01-bf7e-2ec9a076de51-config-data" (OuterVolumeSpecName: "config-data") pod "50f06623-1b90-4e01-bf7e-2ec9a076de51" (UID: "50f06623-1b90-4e01-bf7e-2ec9a076de51"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.082589 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.082658 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.082674 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.082685 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f06623-1b90-4e01-bf7e-2ec9a076de51-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.098555 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.133999 4704 generic.go:334] "Generic (PLEG): container finished" podID="50f06623-1b90-4e01-bf7e-2ec9a076de51" containerID="53e0f5e632e6350bc8fe272187238cdd9ad93684c0205dd237dfe1e83e4088c8" exitCode=0 Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.134077 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"50f06623-1b90-4e01-bf7e-2ec9a076de51","Type":"ContainerDied","Data":"53e0f5e632e6350bc8fe272187238cdd9ad93684c0205dd237dfe1e83e4088c8"} Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.134110 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"50f06623-1b90-4e01-bf7e-2ec9a076de51","Type":"ContainerDied","Data":"85b2ad59172337b02a0c4391ef05691b975f3c3e4b16238eb3c3b1ba8d7d1b56"} Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.134134 4704 scope.go:117] "RemoveContainer" containerID="53e0f5e632e6350bc8fe272187238cdd9ad93684c0205dd237dfe1e83e4088c8" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.134316 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.141010 4704 generic.go:334] "Generic (PLEG): container finished" podID="5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" containerID="e20673c20d7c0311239a889c402809af197e5e17642c1d0f8a07b0ed4b84a282" exitCode=0 Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.141056 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1","Type":"ContainerDied","Data":"e20673c20d7c0311239a889c402809af197e5e17642c1d0f8a07b0ed4b84a282"} Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.141085 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1","Type":"ContainerDied","Data":"ba32e503b540df2d3d9a9c975cebace1d563128b7133d473678e22db6a2d3d5e"} Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.141110 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.159559 4704 scope.go:117] "RemoveContainer" containerID="3753e50ca5b6836d1569754164ebfba39f94ed2a061ff439f703ef34e18b77a1" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.184584 4704 scope.go:117] "RemoveContainer" containerID="d083ef3e0a40aa85473df44c192356a13c5685ae70d580460fe4f1d08c1bcbd5" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.184851 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.197148 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.206745 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.213233 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.218271 4704 scope.go:117] "RemoveContainer" containerID="53e0f5e632e6350bc8fe272187238cdd9ad93684c0205dd237dfe1e83e4088c8" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.218406 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Nov 25 15:59:39 crc kubenswrapper[4704]: E1125 15:59:39.218878 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53e0f5e632e6350bc8fe272187238cdd9ad93684c0205dd237dfe1e83e4088c8\": container with ID starting with 53e0f5e632e6350bc8fe272187238cdd9ad93684c0205dd237dfe1e83e4088c8 not found: ID does not exist" 
containerID="53e0f5e632e6350bc8fe272187238cdd9ad93684c0205dd237dfe1e83e4088c8" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.218928 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53e0f5e632e6350bc8fe272187238cdd9ad93684c0205dd237dfe1e83e4088c8"} err="failed to get container status \"53e0f5e632e6350bc8fe272187238cdd9ad93684c0205dd237dfe1e83e4088c8\": rpc error: code = NotFound desc = could not find container \"53e0f5e632e6350bc8fe272187238cdd9ad93684c0205dd237dfe1e83e4088c8\": container with ID starting with 53e0f5e632e6350bc8fe272187238cdd9ad93684c0205dd237dfe1e83e4088c8 not found: ID does not exist" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.218961 4704 scope.go:117] "RemoveContainer" containerID="3753e50ca5b6836d1569754164ebfba39f94ed2a061ff439f703ef34e18b77a1" Nov 25 15:59:39 crc kubenswrapper[4704]: E1125 15:59:39.219390 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3753e50ca5b6836d1569754164ebfba39f94ed2a061ff439f703ef34e18b77a1\": container with ID starting with 3753e50ca5b6836d1569754164ebfba39f94ed2a061ff439f703ef34e18b77a1 not found: ID does not exist" containerID="3753e50ca5b6836d1569754164ebfba39f94ed2a061ff439f703ef34e18b77a1" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.219412 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3753e50ca5b6836d1569754164ebfba39f94ed2a061ff439f703ef34e18b77a1"} err="failed to get container status \"3753e50ca5b6836d1569754164ebfba39f94ed2a061ff439f703ef34e18b77a1\": rpc error: code = NotFound desc = could not find container \"3753e50ca5b6836d1569754164ebfba39f94ed2a061ff439f703ef34e18b77a1\": container with ID starting with 3753e50ca5b6836d1569754164ebfba39f94ed2a061ff439f703ef34e18b77a1 not found: ID does not exist" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.219426 4704 scope.go:117] 
"RemoveContainer" containerID="d083ef3e0a40aa85473df44c192356a13c5685ae70d580460fe4f1d08c1bcbd5" Nov 25 15:59:39 crc kubenswrapper[4704]: E1125 15:59:39.219612 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d083ef3e0a40aa85473df44c192356a13c5685ae70d580460fe4f1d08c1bcbd5\": container with ID starting with d083ef3e0a40aa85473df44c192356a13c5685ae70d580460fe4f1d08c1bcbd5 not found: ID does not exist" containerID="d083ef3e0a40aa85473df44c192356a13c5685ae70d580460fe4f1d08c1bcbd5" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.219639 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d083ef3e0a40aa85473df44c192356a13c5685ae70d580460fe4f1d08c1bcbd5"} err="failed to get container status \"d083ef3e0a40aa85473df44c192356a13c5685ae70d580460fe4f1d08c1bcbd5\": rpc error: code = NotFound desc = could not find container \"d083ef3e0a40aa85473df44c192356a13c5685ae70d580460fe4f1d08c1bcbd5\": container with ID starting with d083ef3e0a40aa85473df44c192356a13c5685ae70d580460fe4f1d08c1bcbd5 not found: ID does not exist" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.219654 4704 scope.go:117] "RemoveContainer" containerID="e20673c20d7c0311239a889c402809af197e5e17642c1d0f8a07b0ed4b84a282" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.241480 4704 scope.go:117] "RemoveContainer" containerID="24e50b7918df0089131f79d0e9d91c4fba43f92bb57ce19fb680b2083de2f86f" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.265300 4704 scope.go:117] "RemoveContainer" containerID="4db0677b72edbbc5a0a2e79a23bb39f11137ef6493ce1c831c76760a7a95ba6e" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.284781 4704 scope.go:117] "RemoveContainer" containerID="e20673c20d7c0311239a889c402809af197e5e17642c1d0f8a07b0ed4b84a282" Nov 25 15:59:39 crc kubenswrapper[4704]: E1125 15:59:39.285313 4704 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"e20673c20d7c0311239a889c402809af197e5e17642c1d0f8a07b0ed4b84a282\": container with ID starting with e20673c20d7c0311239a889c402809af197e5e17642c1d0f8a07b0ed4b84a282 not found: ID does not exist" containerID="e20673c20d7c0311239a889c402809af197e5e17642c1d0f8a07b0ed4b84a282" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.285357 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20673c20d7c0311239a889c402809af197e5e17642c1d0f8a07b0ed4b84a282"} err="failed to get container status \"e20673c20d7c0311239a889c402809af197e5e17642c1d0f8a07b0ed4b84a282\": rpc error: code = NotFound desc = could not find container \"e20673c20d7c0311239a889c402809af197e5e17642c1d0f8a07b0ed4b84a282\": container with ID starting with e20673c20d7c0311239a889c402809af197e5e17642c1d0f8a07b0ed4b84a282 not found: ID does not exist" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.285385 4704 scope.go:117] "RemoveContainer" containerID="24e50b7918df0089131f79d0e9d91c4fba43f92bb57ce19fb680b2083de2f86f" Nov 25 15:59:39 crc kubenswrapper[4704]: E1125 15:59:39.285808 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24e50b7918df0089131f79d0e9d91c4fba43f92bb57ce19fb680b2083de2f86f\": container with ID starting with 24e50b7918df0089131f79d0e9d91c4fba43f92bb57ce19fb680b2083de2f86f not found: ID does not exist" containerID="24e50b7918df0089131f79d0e9d91c4fba43f92bb57ce19fb680b2083de2f86f" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.285844 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e50b7918df0089131f79d0e9d91c4fba43f92bb57ce19fb680b2083de2f86f"} err="failed to get container status \"24e50b7918df0089131f79d0e9d91c4fba43f92bb57ce19fb680b2083de2f86f\": rpc error: code = NotFound desc = could not find container 
\"24e50b7918df0089131f79d0e9d91c4fba43f92bb57ce19fb680b2083de2f86f\": container with ID starting with 24e50b7918df0089131f79d0e9d91c4fba43f92bb57ce19fb680b2083de2f86f not found: ID does not exist" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.285866 4704 scope.go:117] "RemoveContainer" containerID="4db0677b72edbbc5a0a2e79a23bb39f11137ef6493ce1c831c76760a7a95ba6e" Nov 25 15:59:39 crc kubenswrapper[4704]: E1125 15:59:39.286657 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4db0677b72edbbc5a0a2e79a23bb39f11137ef6493ce1c831c76760a7a95ba6e\": container with ID starting with 4db0677b72edbbc5a0a2e79a23bb39f11137ef6493ce1c831c76760a7a95ba6e not found: ID does not exist" containerID="4db0677b72edbbc5a0a2e79a23bb39f11137ef6493ce1c831c76760a7a95ba6e" Nov 25 15:59:39 crc kubenswrapper[4704]: I1125 15:59:39.286709 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db0677b72edbbc5a0a2e79a23bb39f11137ef6493ce1c831c76760a7a95ba6e"} err="failed to get container status \"4db0677b72edbbc5a0a2e79a23bb39f11137ef6493ce1c831c76760a7a95ba6e\": rpc error: code = NotFound desc = could not find container \"4db0677b72edbbc5a0a2e79a23bb39f11137ef6493ce1c831c76760a7a95ba6e\": container with ID starting with 4db0677b72edbbc5a0a2e79a23bb39f11137ef6493ce1c831c76760a7a95ba6e not found: ID does not exist" Nov 25 15:59:40 crc kubenswrapper[4704]: I1125 15:59:40.426003 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50f06623-1b90-4e01-bf7e-2ec9a076de51" path="/var/lib/kubelet/pods/50f06623-1b90-4e01-bf7e-2ec9a076de51/volumes" Nov 25 15:59:40 crc kubenswrapper[4704]: I1125 15:59:40.427290 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" path="/var/lib/kubelet/pods/5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1/volumes" Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.278297 
4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-49h8n"] Nov 25 15:59:54 crc kubenswrapper[4704]: E1125 15:59:54.280828 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a58a0eb-db99-48d3-aa22-c9ff8fc874c8" containerName="extract-utilities" Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.281141 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a58a0eb-db99-48d3-aa22-c9ff8fc874c8" containerName="extract-utilities" Nov 25 15:59:54 crc kubenswrapper[4704]: E1125 15:59:54.281263 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f06623-1b90-4e01-bf7e-2ec9a076de51" containerName="glance-log" Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.281348 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f06623-1b90-4e01-bf7e-2ec9a076de51" containerName="glance-log" Nov 25 15:59:54 crc kubenswrapper[4704]: E1125 15:59:54.281441 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f06623-1b90-4e01-bf7e-2ec9a076de51" containerName="glance-httpd" Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.281519 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f06623-1b90-4e01-bf7e-2ec9a076de51" containerName="glance-httpd" Nov 25 15:59:54 crc kubenswrapper[4704]: E1125 15:59:54.281614 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" containerName="glance-log" Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.281689 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" containerName="glance-log" Nov 25 15:59:54 crc kubenswrapper[4704]: E1125 15:59:54.281838 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" containerName="glance-api" Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.281929 4704 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" containerName="glance-api" Nov 25 15:59:54 crc kubenswrapper[4704]: E1125 15:59:54.282009 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a58a0eb-db99-48d3-aa22-c9ff8fc874c8" containerName="registry-server" Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.282087 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a58a0eb-db99-48d3-aa22-c9ff8fc874c8" containerName="registry-server" Nov 25 15:59:54 crc kubenswrapper[4704]: E1125 15:59:54.282169 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f06623-1b90-4e01-bf7e-2ec9a076de51" containerName="glance-api" Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.282241 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f06623-1b90-4e01-bf7e-2ec9a076de51" containerName="glance-api" Nov 25 15:59:54 crc kubenswrapper[4704]: E1125 15:59:54.282320 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a58a0eb-db99-48d3-aa22-c9ff8fc874c8" containerName="extract-content" Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.282391 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a58a0eb-db99-48d3-aa22-c9ff8fc874c8" containerName="extract-content" Nov 25 15:59:54 crc kubenswrapper[4704]: E1125 15:59:54.282469 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" containerName="glance-httpd" Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.282598 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" containerName="glance-httpd" Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.282934 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" containerName="glance-log" Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.283027 4704 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="50f06623-1b90-4e01-bf7e-2ec9a076de51" containerName="glance-httpd"
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.283109 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f06623-1b90-4e01-bf7e-2ec9a076de51" containerName="glance-api"
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.283195 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" containerName="glance-httpd"
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.283274 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="5740db0f-d924-4c99-a5f8-a5b6fcf6d2d1" containerName="glance-api"
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.283351 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f06623-1b90-4e01-bf7e-2ec9a076de51" containerName="glance-log"
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.283425 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a58a0eb-db99-48d3-aa22-c9ff8fc874c8" containerName="registry-server"
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.291076 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49h8n"
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.295638 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-49h8n"]
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.424859 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11c7083d-4abd-4a07-99e0-dd218f9a145b-catalog-content\") pod \"certified-operators-49h8n\" (UID: \"11c7083d-4abd-4a07-99e0-dd218f9a145b\") " pod="openshift-marketplace/certified-operators-49h8n"
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.424918 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11c7083d-4abd-4a07-99e0-dd218f9a145b-utilities\") pod \"certified-operators-49h8n\" (UID: \"11c7083d-4abd-4a07-99e0-dd218f9a145b\") " pod="openshift-marketplace/certified-operators-49h8n"
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.424964 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k66rh\" (UniqueName: \"kubernetes.io/projected/11c7083d-4abd-4a07-99e0-dd218f9a145b-kube-api-access-k66rh\") pod \"certified-operators-49h8n\" (UID: \"11c7083d-4abd-4a07-99e0-dd218f9a145b\") " pod="openshift-marketplace/certified-operators-49h8n"
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.525644 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11c7083d-4abd-4a07-99e0-dd218f9a145b-catalog-content\") pod \"certified-operators-49h8n\" (UID: \"11c7083d-4abd-4a07-99e0-dd218f9a145b\") " pod="openshift-marketplace/certified-operators-49h8n"
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.525766 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11c7083d-4abd-4a07-99e0-dd218f9a145b-utilities\") pod \"certified-operators-49h8n\" (UID: \"11c7083d-4abd-4a07-99e0-dd218f9a145b\") " pod="openshift-marketplace/certified-operators-49h8n"
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.526492 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11c7083d-4abd-4a07-99e0-dd218f9a145b-catalog-content\") pod \"certified-operators-49h8n\" (UID: \"11c7083d-4abd-4a07-99e0-dd218f9a145b\") " pod="openshift-marketplace/certified-operators-49h8n"
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.526545 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11c7083d-4abd-4a07-99e0-dd218f9a145b-utilities\") pod \"certified-operators-49h8n\" (UID: \"11c7083d-4abd-4a07-99e0-dd218f9a145b\") " pod="openshift-marketplace/certified-operators-49h8n"
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.526638 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k66rh\" (UniqueName: \"kubernetes.io/projected/11c7083d-4abd-4a07-99e0-dd218f9a145b-kube-api-access-k66rh\") pod \"certified-operators-49h8n\" (UID: \"11c7083d-4abd-4a07-99e0-dd218f9a145b\") " pod="openshift-marketplace/certified-operators-49h8n"
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.547690 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k66rh\" (UniqueName: \"kubernetes.io/projected/11c7083d-4abd-4a07-99e0-dd218f9a145b-kube-api-access-k66rh\") pod \"certified-operators-49h8n\" (UID: \"11c7083d-4abd-4a07-99e0-dd218f9a145b\") " pod="openshift-marketplace/certified-operators-49h8n"
Nov 25 15:59:54 crc kubenswrapper[4704]: I1125 15:59:54.620200 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49h8n"
Nov 25 15:59:55 crc kubenswrapper[4704]: I1125 15:59:55.112892 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-49h8n"]
Nov 25 15:59:55 crc kubenswrapper[4704]: I1125 15:59:55.258464 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49h8n" event={"ID":"11c7083d-4abd-4a07-99e0-dd218f9a145b","Type":"ContainerStarted","Data":"fb0060621f2a7e0702f814cc7c763c3e9264de13b191c487a4758b5b032673fb"}
Nov 25 15:59:56 crc kubenswrapper[4704]: I1125 15:59:56.267778 4704 generic.go:334] "Generic (PLEG): container finished" podID="11c7083d-4abd-4a07-99e0-dd218f9a145b" containerID="981484de944a4f30a774f10d144d9fd7102211b29a1acf133c1228752f950a24" exitCode=0
Nov 25 15:59:56 crc kubenswrapper[4704]: I1125 15:59:56.267975 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49h8n" event={"ID":"11c7083d-4abd-4a07-99e0-dd218f9a145b","Type":"ContainerDied","Data":"981484de944a4f30a774f10d144d9fd7102211b29a1acf133c1228752f950a24"}
Nov 25 15:59:58 crc kubenswrapper[4704]: I1125 15:59:58.284841 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49h8n" event={"ID":"11c7083d-4abd-4a07-99e0-dd218f9a145b","Type":"ContainerStarted","Data":"0bcd485d50f4f71d8baa6e77d2596ed0aea41601937538485905644798e31141"}
Nov 25 15:59:59 crc kubenswrapper[4704]: I1125 15:59:59.296420 4704 generic.go:334] "Generic (PLEG): container finished" podID="11c7083d-4abd-4a07-99e0-dd218f9a145b" containerID="0bcd485d50f4f71d8baa6e77d2596ed0aea41601937538485905644798e31141" exitCode=0
Nov 25 15:59:59 crc kubenswrapper[4704]: I1125 15:59:59.296486 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49h8n" event={"ID":"11c7083d-4abd-4a07-99e0-dd218f9a145b","Type":"ContainerDied","Data":"0bcd485d50f4f71d8baa6e77d2596ed0aea41601937538485905644798e31141"}
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.149234 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg"]
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.151366 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.154503 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz"]
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.156772 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.163534 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg"]
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.173111 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz"]
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.244770 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk"]
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.246045 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.250780 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.251836 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.252770 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk"]
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.313484 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/d79d8223-8a36-4af2-84ac-ce2d4c024eb4-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg\" (UID: \"d79d8223-8a36-4af2-84ac-ce2d4c024eb4\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.313544 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/45f5c1d5-2ab3-46ef-89f4-2447584718e2-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz\" (UID: \"45f5c1d5-2ab3-46ef-89f4-2447584718e2\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.313574 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qh8l\" (UniqueName: \"kubernetes.io/projected/45f5c1d5-2ab3-46ef-89f4-2447584718e2-kube-api-access-5qh8l\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz\" (UID: \"45f5c1d5-2ab3-46ef-89f4-2447584718e2\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.313600 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg\" (UID: \"d79d8223-8a36-4af2-84ac-ce2d4c024eb4\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.313631 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz\" (UID: \"45f5c1d5-2ab3-46ef-89f4-2447584718e2\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.313684 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-652lw\" (UniqueName: \"kubernetes.io/projected/d79d8223-8a36-4af2-84ac-ce2d4c024eb4-kube-api-access-652lw\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg\" (UID: \"d79d8223-8a36-4af2-84ac-ce2d4c024eb4\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.323407 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49h8n" event={"ID":"11c7083d-4abd-4a07-99e0-dd218f9a145b","Type":"ContainerStarted","Data":"7e9c8cec97ba3d417e4bf74374b2fa06bcc59044e6c721e9db14784190d44b26"}
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.350829 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-49h8n" podStartSLOduration=2.926104631 podStartE2EDuration="6.350780947s" podCreationTimestamp="2025-11-25 15:59:54 +0000 UTC" firstStartedPulling="2025-11-25 15:59:56.269738124 +0000 UTC m=+1482.538011905" lastFinishedPulling="2025-11-25 15:59:59.69441445 +0000 UTC m=+1485.962688221" observedRunningTime="2025-11-25 16:00:00.346348319 +0000 UTC m=+1486.614622100" watchObservedRunningTime="2025-11-25 16:00:00.350780947 +0000 UTC m=+1486.619054728"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.353866 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz\" (UID: \"45f5c1d5-2ab3-46ef-89f4-2447584718e2\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.365254 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg\" (UID: \"d79d8223-8a36-4af2-84ac-ce2d4c024eb4\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.414548 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-652lw\" (UniqueName: \"kubernetes.io/projected/d79d8223-8a36-4af2-84ac-ce2d4c024eb4-kube-api-access-652lw\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg\" (UID: \"d79d8223-8a36-4af2-84ac-ce2d4c024eb4\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.414613 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-secret-volume\") pod \"collect-profiles-29401440-7xsgk\" (UID: \"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.414676 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/d79d8223-8a36-4af2-84ac-ce2d4c024eb4-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg\" (UID: \"d79d8223-8a36-4af2-84ac-ce2d4c024eb4\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.414707 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/45f5c1d5-2ab3-46ef-89f4-2447584718e2-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz\" (UID: \"45f5c1d5-2ab3-46ef-89f4-2447584718e2\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.414737 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfksr\" (UniqueName: \"kubernetes.io/projected/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-kube-api-access-mfksr\") pod \"collect-profiles-29401440-7xsgk\" (UID: \"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.414765 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qh8l\" (UniqueName: \"kubernetes.io/projected/45f5c1d5-2ab3-46ef-89f4-2447584718e2-kube-api-access-5qh8l\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz\" (UID: \"45f5c1d5-2ab3-46ef-89f4-2447584718e2\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.414823 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-config-volume\") pod \"collect-profiles-29401440-7xsgk\" (UID: \"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.420650 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/45f5c1d5-2ab3-46ef-89f4-2447584718e2-image-cache-config-data\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz\" (UID: \"45f5c1d5-2ab3-46ef-89f4-2447584718e2\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.420649 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/d79d8223-8a36-4af2-84ac-ce2d4c024eb4-image-cache-config-data\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg\" (UID: \"d79d8223-8a36-4af2-84ac-ce2d4c024eb4\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.434390 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qh8l\" (UniqueName: \"kubernetes.io/projected/45f5c1d5-2ab3-46ef-89f4-2447584718e2-kube-api-access-5qh8l\") pod \"glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz\" (UID: \"45f5c1d5-2ab3-46ef-89f4-2447584718e2\") " pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.435171 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-652lw\" (UniqueName: \"kubernetes.io/projected/d79d8223-8a36-4af2-84ac-ce2d4c024eb4-kube-api-access-652lw\") pod \"glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg\" (UID: \"d79d8223-8a36-4af2-84ac-ce2d4c024eb4\") " pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.472063 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.486166 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.515883 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-config-volume\") pod \"collect-profiles-29401440-7xsgk\" (UID: \"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.515956 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-secret-volume\") pod \"collect-profiles-29401440-7xsgk\" (UID: \"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.516005 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfksr\" (UniqueName: \"kubernetes.io/projected/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-kube-api-access-mfksr\") pod \"collect-profiles-29401440-7xsgk\" (UID: \"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.517480 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-config-volume\") pod \"collect-profiles-29401440-7xsgk\" (UID: \"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.521018 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-secret-volume\") pod \"collect-profiles-29401440-7xsgk\" (UID: \"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.531898 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfksr\" (UniqueName: \"kubernetes.io/projected/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-kube-api-access-mfksr\") pod \"collect-profiles-29401440-7xsgk\" (UID: \"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.564266 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk"
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.991003 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz"]
Nov 25 16:00:00 crc kubenswrapper[4704]: I1125 16:00:00.995970 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg"]
Nov 25 16:00:01 crc kubenswrapper[4704]: W1125 16:00:01.001395 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd79d8223_8a36_4af2_84ac_ce2d4c024eb4.slice/crio-19eb6c48f03a48105b597c9d550e64dcb777f8c31ea7cfe6471bcab976564e0f WatchSource:0}: Error finding container 19eb6c48f03a48105b597c9d550e64dcb777f8c31ea7cfe6471bcab976564e0f: Status 404 returned error can't find the container with id 19eb6c48f03a48105b597c9d550e64dcb777f8c31ea7cfe6471bcab976564e0f
Nov 25 16:00:01 crc kubenswrapper[4704]: W1125 16:00:01.002952 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45f5c1d5_2ab3_46ef_89f4_2447584718e2.slice/crio-658822c54a05d4e94550c96dcabba89403ca3c2bdf5240c2f4b6c0e87ba1db5a WatchSource:0}: Error finding container 658822c54a05d4e94550c96dcabba89403ca3c2bdf5240c2f4b6c0e87ba1db5a: Status 404 returned error can't find the container with id 658822c54a05d4e94550c96dcabba89403ca3c2bdf5240c2f4b6c0e87ba1db5a
Nov 25 16:00:01 crc kubenswrapper[4704]: I1125 16:00:01.151996 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk"]
Nov 25 16:00:01 crc kubenswrapper[4704]: I1125 16:00:01.336031 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk" event={"ID":"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55","Type":"ContainerStarted","Data":"1be5c35a00cd233772073b6357b46fb9b5e694c70e7ab80015bde2dfb302ff0d"}
Nov 25 16:00:01 crc kubenswrapper[4704]: I1125 16:00:01.338347 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz" event={"ID":"45f5c1d5-2ab3-46ef-89f4-2447584718e2","Type":"ContainerStarted","Data":"658822c54a05d4e94550c96dcabba89403ca3c2bdf5240c2f4b6c0e87ba1db5a"}
Nov 25 16:00:01 crc kubenswrapper[4704]: I1125 16:00:01.339872 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg" event={"ID":"d79d8223-8a36-4af2-84ac-ce2d4c024eb4","Type":"ContainerStarted","Data":"19eb6c48f03a48105b597c9d550e64dcb777f8c31ea7cfe6471bcab976564e0f"}
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.351870 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg" event={"ID":"d79d8223-8a36-4af2-84ac-ce2d4c024eb4","Type":"ContainerStarted","Data":"98389b4dee9c0d9a0c12f328d9bcc3ccebdc341cc1873f0b5cd310cc29b2ecd0"}
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.354805 4704 generic.go:334] "Generic (PLEG): container finished" podID="fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55" containerID="b3f73ec47f0c38ad2321331f8406106a26965de700ff110d21c4ba411fd1fa8c" exitCode=0
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.354910 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk" event={"ID":"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55","Type":"ContainerDied","Data":"b3f73ec47f0c38ad2321331f8406106a26965de700ff110d21c4ba411fd1fa8c"}
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.357393 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz" event={"ID":"45f5c1d5-2ab3-46ef-89f4-2447584718e2","Type":"ContainerStarted","Data":"51b9520d80823ddde7c03722665803d37af915c16cbd1be2a90c02c02766515a"}
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.369945 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg" podStartSLOduration=2.369921011 podStartE2EDuration="2.369921011s" podCreationTimestamp="2025-11-25 16:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:00:02.367181952 +0000 UTC m=+1488.635455733" watchObservedRunningTime="2025-11-25 16:00:02.369921011 +0000 UTC m=+1488.638194812"
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.404335 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz" podStartSLOduration=2.404312866 podStartE2EDuration="2.404312866s" podCreationTimestamp="2025-11-25 16:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:00:02.402835693 +0000 UTC m=+1488.671109474" watchObservedRunningTime="2025-11-25 16:00:02.404312866 +0000 UTC m=+1488.672586647"
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.557161 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qwsvr"]
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.559102 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qwsvr"
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.576195 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qwsvr"]
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.663978 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-utilities\") pod \"redhat-marketplace-qwsvr\" (UID: \"b0b38583-f22a-45dc-b1f6-1fb67f046f2d\") " pod="openshift-marketplace/redhat-marketplace-qwsvr"
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.664107 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntxv8\" (UniqueName: \"kubernetes.io/projected/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-kube-api-access-ntxv8\") pod \"redhat-marketplace-qwsvr\" (UID: \"b0b38583-f22a-45dc-b1f6-1fb67f046f2d\") " pod="openshift-marketplace/redhat-marketplace-qwsvr"
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.664137 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-catalog-content\") pod \"redhat-marketplace-qwsvr\" (UID: \"b0b38583-f22a-45dc-b1f6-1fb67f046f2d\") " pod="openshift-marketplace/redhat-marketplace-qwsvr"
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.766062 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-utilities\") pod \"redhat-marketplace-qwsvr\" (UID: \"b0b38583-f22a-45dc-b1f6-1fb67f046f2d\") " pod="openshift-marketplace/redhat-marketplace-qwsvr"
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.766229 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntxv8\" (UniqueName: \"kubernetes.io/projected/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-kube-api-access-ntxv8\") pod \"redhat-marketplace-qwsvr\" (UID: \"b0b38583-f22a-45dc-b1f6-1fb67f046f2d\") " pod="openshift-marketplace/redhat-marketplace-qwsvr"
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.766263 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-catalog-content\") pod \"redhat-marketplace-qwsvr\" (UID: \"b0b38583-f22a-45dc-b1f6-1fb67f046f2d\") " pod="openshift-marketplace/redhat-marketplace-qwsvr"
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.767816 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-catalog-content\") pod \"redhat-marketplace-qwsvr\" (UID: \"b0b38583-f22a-45dc-b1f6-1fb67f046f2d\") " pod="openshift-marketplace/redhat-marketplace-qwsvr"
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.767939 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-utilities\") pod \"redhat-marketplace-qwsvr\" (UID: \"b0b38583-f22a-45dc-b1f6-1fb67f046f2d\") " pod="openshift-marketplace/redhat-marketplace-qwsvr"
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.792186 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntxv8\" (UniqueName: \"kubernetes.io/projected/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-kube-api-access-ntxv8\") pod \"redhat-marketplace-qwsvr\" (UID: \"b0b38583-f22a-45dc-b1f6-1fb67f046f2d\") " pod="openshift-marketplace/redhat-marketplace-qwsvr"
Nov 25 16:00:02 crc kubenswrapper[4704]: I1125 16:00:02.893864 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qwsvr"
Nov 25 16:00:03 crc kubenswrapper[4704]: I1125 16:00:03.368364 4704 generic.go:334] "Generic (PLEG): container finished" podID="45f5c1d5-2ab3-46ef-89f4-2447584718e2" containerID="51b9520d80823ddde7c03722665803d37af915c16cbd1be2a90c02c02766515a" exitCode=0
Nov 25 16:00:03 crc kubenswrapper[4704]: I1125 16:00:03.368447 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz" event={"ID":"45f5c1d5-2ab3-46ef-89f4-2447584718e2","Type":"ContainerDied","Data":"51b9520d80823ddde7c03722665803d37af915c16cbd1be2a90c02c02766515a"}
Nov 25 16:00:03 crc kubenswrapper[4704]: I1125 16:00:03.370417 4704 generic.go:334] "Generic (PLEG): container finished" podID="d79d8223-8a36-4af2-84ac-ce2d4c024eb4" containerID="98389b4dee9c0d9a0c12f328d9bcc3ccebdc341cc1873f0b5cd310cc29b2ecd0" exitCode=0
Nov 25 16:00:03 crc kubenswrapper[4704]: I1125 16:00:03.370801 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg" event={"ID":"d79d8223-8a36-4af2-84ac-ce2d4c024eb4","Type":"ContainerDied","Data":"98389b4dee9c0d9a0c12f328d9bcc3ccebdc341cc1873f0b5cd310cc29b2ecd0"}
Nov 25 16:00:03 crc kubenswrapper[4704]: I1125 16:00:03.430842 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qwsvr"]
Nov 25 16:00:03 crc kubenswrapper[4704]: I1125 16:00:03.589726 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk"
Nov 25 16:00:03 crc kubenswrapper[4704]: I1125 16:00:03.682351 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-secret-volume\") pod \"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55\" (UID: \"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55\") "
Nov 25 16:00:03 crc kubenswrapper[4704]: I1125 16:00:03.682452 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-config-volume\") pod \"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55\" (UID: \"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55\") "
Nov 25 16:00:03 crc kubenswrapper[4704]: I1125 16:00:03.682537 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfksr\" (UniqueName: \"kubernetes.io/projected/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-kube-api-access-mfksr\") pod \"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55\" (UID: \"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55\") "
Nov 25 16:00:03 crc kubenswrapper[4704]: I1125 16:00:03.683510 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-config-volume" (OuterVolumeSpecName: "config-volume") pod "fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55" (UID: "fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 16:00:03 crc kubenswrapper[4704]: I1125 16:00:03.689753 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55" (UID: "fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 16:00:03 crc kubenswrapper[4704]: I1125 16:00:03.689819 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-kube-api-access-mfksr" (OuterVolumeSpecName: "kube-api-access-mfksr") pod "fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55" (UID: "fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55"). InnerVolumeSpecName "kube-api-access-mfksr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:00:03 crc kubenswrapper[4704]: I1125 16:00:03.783877 4704 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-secret-volume\") on node \"crc\" DevicePath \"\""
Nov 25 16:00:03 crc kubenswrapper[4704]: I1125 16:00:03.783914 4704 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-config-volume\") on node \"crc\" DevicePath \"\""
Nov 25 16:00:03 crc kubenswrapper[4704]: I1125 16:00:03.783924 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfksr\" (UniqueName: \"kubernetes.io/projected/fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55-kube-api-access-mfksr\") on node \"crc\" DevicePath \"\""
Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.381583 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk" event={"ID":"fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55","Type":"ContainerDied","Data":"1be5c35a00cd233772073b6357b46fb9b5e694c70e7ab80015bde2dfb302ff0d"}
Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.382350 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1be5c35a00cd233772073b6357b46fb9b5e694c70e7ab80015bde2dfb302ff0d"
Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.381600 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401440-7xsgk"
Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.384400 4704 generic.go:334] "Generic (PLEG): container finished" podID="b0b38583-f22a-45dc-b1f6-1fb67f046f2d" containerID="13679ea0bdf964fb13f2819ad6203833b1f079e22c3c3567c3c504c7ee7eb226" exitCode=0
Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.384440 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qwsvr" event={"ID":"b0b38583-f22a-45dc-b1f6-1fb67f046f2d","Type":"ContainerDied","Data":"13679ea0bdf964fb13f2819ad6203833b1f079e22c3c3567c3c504c7ee7eb226"}
Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.384484 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qwsvr" event={"ID":"b0b38583-f22a-45dc-b1f6-1fb67f046f2d","Type":"ContainerStarted","Data":"3db84b2b8c81e685d646935003fa60dfbd268ad5822e97cb7c2b70c55c8b9d4d"}
Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.621345 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-49h8n"
Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.621381 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-49h8n"
Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.679540 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup"
status="started" pod="openshift-marketplace/certified-operators-49h8n" Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.745711 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz" Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.751388 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg" Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.904352 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"d79d8223-8a36-4af2-84ac-ce2d4c024eb4\" (UID: \"d79d8223-8a36-4af2-84ac-ce2d4c024eb4\") " Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.904878 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/d79d8223-8a36-4af2-84ac-ce2d4c024eb4-image-cache-config-data\") pod \"d79d8223-8a36-4af2-84ac-ce2d4c024eb4\" (UID: \"d79d8223-8a36-4af2-84ac-ce2d4c024eb4\") " Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.904913 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"45f5c1d5-2ab3-46ef-89f4-2447584718e2\" (UID: \"45f5c1d5-2ab3-46ef-89f4-2447584718e2\") " Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.904952 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qh8l\" (UniqueName: \"kubernetes.io/projected/45f5c1d5-2ab3-46ef-89f4-2447584718e2-kube-api-access-5qh8l\") pod \"45f5c1d5-2ab3-46ef-89f4-2447584718e2\" (UID: \"45f5c1d5-2ab3-46ef-89f4-2447584718e2\") " Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.905018 4704 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-652lw\" (UniqueName: \"kubernetes.io/projected/d79d8223-8a36-4af2-84ac-ce2d4c024eb4-kube-api-access-652lw\") pod \"d79d8223-8a36-4af2-84ac-ce2d4c024eb4\" (UID: \"d79d8223-8a36-4af2-84ac-ce2d4c024eb4\") " Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.905052 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/45f5c1d5-2ab3-46ef-89f4-2447584718e2-image-cache-config-data\") pod \"45f5c1d5-2ab3-46ef-89f4-2447584718e2\" (UID: \"45f5c1d5-2ab3-46ef-89f4-2447584718e2\") " Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.910759 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance-cache") pod "d79d8223-8a36-4af2-84ac-ce2d4c024eb4" (UID: "d79d8223-8a36-4af2-84ac-ce2d4c024eb4"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.911004 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d79d8223-8a36-4af2-84ac-ce2d4c024eb4-image-cache-config-data" (OuterVolumeSpecName: "image-cache-config-data") pod "d79d8223-8a36-4af2-84ac-ce2d4c024eb4" (UID: "d79d8223-8a36-4af2-84ac-ce2d4c024eb4"). InnerVolumeSpecName "image-cache-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.911512 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f5c1d5-2ab3-46ef-89f4-2447584718e2-kube-api-access-5qh8l" (OuterVolumeSpecName: "kube-api-access-5qh8l") pod "45f5c1d5-2ab3-46ef-89f4-2447584718e2" (UID: "45f5c1d5-2ab3-46ef-89f4-2447584718e2"). InnerVolumeSpecName "kube-api-access-5qh8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.911529 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "45f5c1d5-2ab3-46ef-89f4-2447584718e2" (UID: "45f5c1d5-2ab3-46ef-89f4-2447584718e2"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.912005 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d79d8223-8a36-4af2-84ac-ce2d4c024eb4-kube-api-access-652lw" (OuterVolumeSpecName: "kube-api-access-652lw") pod "d79d8223-8a36-4af2-84ac-ce2d4c024eb4" (UID: "d79d8223-8a36-4af2-84ac-ce2d4c024eb4"). InnerVolumeSpecName "kube-api-access-652lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:00:04 crc kubenswrapper[4704]: I1125 16:00:04.912818 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45f5c1d5-2ab3-46ef-89f4-2447584718e2-image-cache-config-data" (OuterVolumeSpecName: "image-cache-config-data") pod "45f5c1d5-2ab3-46ef-89f4-2447584718e2" (UID: "45f5c1d5-2ab3-46ef-89f4-2447584718e2"). InnerVolumeSpecName "image-cache-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.009995 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-652lw\" (UniqueName: \"kubernetes.io/projected/d79d8223-8a36-4af2-84ac-ce2d4c024eb4-kube-api-access-652lw\") on node \"crc\" DevicePath \"\"" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.010044 4704 reconciler_common.go:293] "Volume detached for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/45f5c1d5-2ab3-46ef-89f4-2447584718e2-image-cache-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.010056 4704 reconciler_common.go:293] "Volume detached for volume \"image-cache-config-data\" (UniqueName: \"kubernetes.io/secret/d79d8223-8a36-4af2-84ac-ce2d4c024eb4-image-cache-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.010070 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qh8l\" (UniqueName: \"kubernetes.io/projected/45f5c1d5-2ab3-46ef-89f4-2447584718e2-kube-api-access-5qh8l\") on node \"crc\" DevicePath \"\"" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.155263 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7lp6r"] Nov 25 16:00:05 crc kubenswrapper[4704]: E1125 16:00:05.155771 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55" containerName="collect-profiles" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.155874 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55" containerName="collect-profiles" Nov 25 16:00:05 crc kubenswrapper[4704]: E1125 16:00:05.155894 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f5c1d5-2ab3-46ef-89f4-2447584718e2" containerName="glance-cache-glance-default-external-api-0-cleaner" Nov 25 
16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.155902 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f5c1d5-2ab3-46ef-89f4-2447584718e2" containerName="glance-cache-glance-default-external-api-0-cleaner" Nov 25 16:00:05 crc kubenswrapper[4704]: E1125 16:00:05.155912 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d79d8223-8a36-4af2-84ac-ce2d4c024eb4" containerName="glance-cache-glance-default-internal-api-0-cleaner" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.155919 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79d8223-8a36-4af2-84ac-ce2d4c024eb4" containerName="glance-cache-glance-default-internal-api-0-cleaner" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.156098 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="d79d8223-8a36-4af2-84ac-ce2d4c024eb4" containerName="glance-cache-glance-default-internal-api-0-cleaner" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.156112 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f5c1d5-2ab3-46ef-89f4-2447584718e2" containerName="glance-cache-glance-default-external-api-0-cleaner" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.156128 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd2ca3ee-fc1e-4937-bea8-ccb2cdcfef55" containerName="collect-profiles" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.157483 4704 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.163446 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7lp6r"] Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.314088 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48a7441f-b4b6-44e1-9b00-a34e43c4986a-catalog-content\") pod \"redhat-operators-7lp6r\" (UID: \"48a7441f-b4b6-44e1-9b00-a34e43c4986a\") " pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.314136 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nht5d\" (UniqueName: \"kubernetes.io/projected/48a7441f-b4b6-44e1-9b00-a34e43c4986a-kube-api-access-nht5d\") pod \"redhat-operators-7lp6r\" (UID: \"48a7441f-b4b6-44e1-9b00-a34e43c4986a\") " pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.314339 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48a7441f-b4b6-44e1-9b00-a34e43c4986a-utilities\") pod \"redhat-operators-7lp6r\" (UID: \"48a7441f-b4b6-44e1-9b00-a34e43c4986a\") " pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.394426 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz" event={"ID":"45f5c1d5-2ab3-46ef-89f4-2447584718e2","Type":"ContainerDied","Data":"658822c54a05d4e94550c96dcabba89403ca3c2bdf5240c2f4b6c0e87ba1db5a"} Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.394579 4704 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="658822c54a05d4e94550c96dcabba89403ca3c2bdf5240c2f4b6c0e87ba1db5a" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.394446 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.396490 4704 generic.go:334] "Generic (PLEG): container finished" podID="b0b38583-f22a-45dc-b1f6-1fb67f046f2d" containerID="cfbea08ee5a4fcac5c97dd9db771327534d105b5563f120be6c08b22d8860031" exitCode=0 Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.396550 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qwsvr" event={"ID":"b0b38583-f22a-45dc-b1f6-1fb67f046f2d","Type":"ContainerDied","Data":"cfbea08ee5a4fcac5c97dd9db771327534d105b5563f120be6c08b22d8860031"} Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.406210 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg" event={"ID":"d79d8223-8a36-4af2-84ac-ce2d4c024eb4","Type":"ContainerDied","Data":"19eb6c48f03a48105b597c9d550e64dcb777f8c31ea7cfe6471bcab976564e0f"} Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.406939 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19eb6c48f03a48105b597c9d550e64dcb777f8c31ea7cfe6471bcab976564e0f" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.407035 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.416236 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48a7441f-b4b6-44e1-9b00-a34e43c4986a-utilities\") pod \"redhat-operators-7lp6r\" (UID: \"48a7441f-b4b6-44e1-9b00-a34e43c4986a\") " pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.416297 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48a7441f-b4b6-44e1-9b00-a34e43c4986a-catalog-content\") pod \"redhat-operators-7lp6r\" (UID: \"48a7441f-b4b6-44e1-9b00-a34e43c4986a\") " pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.416324 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nht5d\" (UniqueName: \"kubernetes.io/projected/48a7441f-b4b6-44e1-9b00-a34e43c4986a-kube-api-access-nht5d\") pod \"redhat-operators-7lp6r\" (UID: \"48a7441f-b4b6-44e1-9b00-a34e43c4986a\") " pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.419379 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48a7441f-b4b6-44e1-9b00-a34e43c4986a-catalog-content\") pod \"redhat-operators-7lp6r\" (UID: \"48a7441f-b4b6-44e1-9b00-a34e43c4986a\") " pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.419456 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48a7441f-b4b6-44e1-9b00-a34e43c4986a-utilities\") pod \"redhat-operators-7lp6r\" (UID: \"48a7441f-b4b6-44e1-9b00-a34e43c4986a\") " 
pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.454131 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nht5d\" (UniqueName: \"kubernetes.io/projected/48a7441f-b4b6-44e1-9b00-a34e43c4986a-kube-api-access-nht5d\") pod \"redhat-operators-7lp6r\" (UID: \"48a7441f-b4b6-44e1-9b00-a34e43c4986a\") " pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.477503 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-49h8n" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.491611 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:05 crc kubenswrapper[4704]: I1125 16:00:05.964173 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7lp6r"] Nov 25 16:00:06 crc kubenswrapper[4704]: I1125 16:00:06.414445 4704 generic.go:334] "Generic (PLEG): container finished" podID="48a7441f-b4b6-44e1-9b00-a34e43c4986a" containerID="0b8ea1bd2deb78cdf1513c028db3249b501a737bd42302aff0d591ac8011663c" exitCode=0 Nov 25 16:00:06 crc kubenswrapper[4704]: I1125 16:00:06.414508 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7lp6r" event={"ID":"48a7441f-b4b6-44e1-9b00-a34e43c4986a","Type":"ContainerDied","Data":"0b8ea1bd2deb78cdf1513c028db3249b501a737bd42302aff0d591ac8011663c"} Nov 25 16:00:06 crc kubenswrapper[4704]: I1125 16:00:06.414585 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7lp6r" event={"ID":"48a7441f-b4b6-44e1-9b00-a34e43c4986a","Type":"ContainerStarted","Data":"9cb2f96f148c566ca6a9a768a697b9e32814d5b31e7833944572c26a234ea79d"} Nov 25 16:00:06 crc kubenswrapper[4704]: I1125 16:00:06.425184 4704 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-qwsvr" event={"ID":"b0b38583-f22a-45dc-b1f6-1fb67f046f2d","Type":"ContainerStarted","Data":"33415a4e661564fbfa3edf2d1b56710c8685512bc8e593aa388b0867343e0ccf"} Nov 25 16:00:06 crc kubenswrapper[4704]: I1125 16:00:06.470671 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qwsvr" podStartSLOduration=3.019595219 podStartE2EDuration="4.470647196s" podCreationTimestamp="2025-11-25 16:00:02 +0000 UTC" firstStartedPulling="2025-11-25 16:00:04.387219623 +0000 UTC m=+1490.655493404" lastFinishedPulling="2025-11-25 16:00:05.8382716 +0000 UTC m=+1492.106545381" observedRunningTime="2025-11-25 16:00:06.463932702 +0000 UTC m=+1492.732206483" watchObservedRunningTime="2025-11-25 16:00:06.470647196 +0000 UTC m=+1492.738920977" Nov 25 16:00:07 crc kubenswrapper[4704]: I1125 16:00:07.428121 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7lp6r" event={"ID":"48a7441f-b4b6-44e1-9b00-a34e43c4986a","Type":"ContainerStarted","Data":"4a570ec826ba85026e771eadfef16707e94c410674a741c29a3ae4643ea6100f"} Nov 25 16:00:07 crc kubenswrapper[4704]: I1125 16:00:07.532627 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-49h8n"] Nov 25 16:00:07 crc kubenswrapper[4704]: I1125 16:00:07.532939 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-49h8n" podUID="11c7083d-4abd-4a07-99e0-dd218f9a145b" containerName="registry-server" containerID="cri-o://7e9c8cec97ba3d417e4bf74374b2fa06bcc59044e6c721e9db14784190d44b26" gracePeriod=2 Nov 25 16:00:07 crc kubenswrapper[4704]: I1125 16:00:07.965071 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:00:07 crc kubenswrapper[4704]: I1125 16:00:07.965574 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:00:07 crc kubenswrapper[4704]: I1125 16:00:07.965618 4704 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 16:00:07 crc kubenswrapper[4704]: I1125 16:00:07.966331 4704 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a574ca126e29dffea7335dc4eb45068ba5d1201355a4dba5124c9c343f20921"} pod="openshift-machine-config-operator/machine-config-daemon-djz8x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 16:00:07 crc kubenswrapper[4704]: I1125 16:00:07.966384 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" containerID="cri-o://7a574ca126e29dffea7335dc4eb45068ba5d1201355a4dba5124c9c343f20921" gracePeriod=600 Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.139182 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-49h8n" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.306982 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11c7083d-4abd-4a07-99e0-dd218f9a145b-utilities\") pod \"11c7083d-4abd-4a07-99e0-dd218f9a145b\" (UID: \"11c7083d-4abd-4a07-99e0-dd218f9a145b\") " Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.307017 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11c7083d-4abd-4a07-99e0-dd218f9a145b-catalog-content\") pod \"11c7083d-4abd-4a07-99e0-dd218f9a145b\" (UID: \"11c7083d-4abd-4a07-99e0-dd218f9a145b\") " Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.307087 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k66rh\" (UniqueName: \"kubernetes.io/projected/11c7083d-4abd-4a07-99e0-dd218f9a145b-kube-api-access-k66rh\") pod \"11c7083d-4abd-4a07-99e0-dd218f9a145b\" (UID: \"11c7083d-4abd-4a07-99e0-dd218f9a145b\") " Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.308001 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11c7083d-4abd-4a07-99e0-dd218f9a145b-utilities" (OuterVolumeSpecName: "utilities") pod "11c7083d-4abd-4a07-99e0-dd218f9a145b" (UID: "11c7083d-4abd-4a07-99e0-dd218f9a145b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.314417 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c7083d-4abd-4a07-99e0-dd218f9a145b-kube-api-access-k66rh" (OuterVolumeSpecName: "kube-api-access-k66rh") pod "11c7083d-4abd-4a07-99e0-dd218f9a145b" (UID: "11c7083d-4abd-4a07-99e0-dd218f9a145b"). InnerVolumeSpecName "kube-api-access-k66rh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.409431 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k66rh\" (UniqueName: \"kubernetes.io/projected/11c7083d-4abd-4a07-99e0-dd218f9a145b-kube-api-access-k66rh\") on node \"crc\" DevicePath \"\"" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.409842 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11c7083d-4abd-4a07-99e0-dd218f9a145b-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.446557 4704 generic.go:334] "Generic (PLEG): container finished" podID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerID="7a574ca126e29dffea7335dc4eb45068ba5d1201355a4dba5124c9c343f20921" exitCode=0 Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.446635 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerDied","Data":"7a574ca126e29dffea7335dc4eb45068ba5d1201355a4dba5124c9c343f20921"} Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.446690 4704 scope.go:117] "RemoveContainer" containerID="ed4f107353622069826562153315aa9eb23b779c9df0b35ea109bbd82177caad" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.449031 4704 generic.go:334] "Generic (PLEG): container finished" podID="48a7441f-b4b6-44e1-9b00-a34e43c4986a" containerID="4a570ec826ba85026e771eadfef16707e94c410674a741c29a3ae4643ea6100f" exitCode=0 Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.449123 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7lp6r" event={"ID":"48a7441f-b4b6-44e1-9b00-a34e43c4986a","Type":"ContainerDied","Data":"4a570ec826ba85026e771eadfef16707e94c410674a741c29a3ae4643ea6100f"} Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 
16:00:08.453462 4704 generic.go:334] "Generic (PLEG): container finished" podID="11c7083d-4abd-4a07-99e0-dd218f9a145b" containerID="7e9c8cec97ba3d417e4bf74374b2fa06bcc59044e6c721e9db14784190d44b26" exitCode=0 Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.453507 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49h8n" event={"ID":"11c7083d-4abd-4a07-99e0-dd218f9a145b","Type":"ContainerDied","Data":"7e9c8cec97ba3d417e4bf74374b2fa06bcc59044e6c721e9db14784190d44b26"} Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.453530 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49h8n" event={"ID":"11c7083d-4abd-4a07-99e0-dd218f9a145b","Type":"ContainerDied","Data":"fb0060621f2a7e0702f814cc7c763c3e9264de13b191c487a4758b5b032673fb"} Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.453599 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49h8n" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.482544 4704 scope.go:117] "RemoveContainer" containerID="7e9c8cec97ba3d417e4bf74374b2fa06bcc59044e6c721e9db14784190d44b26" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.500418 4704 scope.go:117] "RemoveContainer" containerID="0bcd485d50f4f71d8baa6e77d2596ed0aea41601937538485905644798e31141" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.521737 4704 scope.go:117] "RemoveContainer" containerID="981484de944a4f30a774f10d144d9fd7102211b29a1acf133c1228752f950a24" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.548250 4704 scope.go:117] "RemoveContainer" containerID="7e9c8cec97ba3d417e4bf74374b2fa06bcc59044e6c721e9db14784190d44b26" Nov 25 16:00:08 crc kubenswrapper[4704]: E1125 16:00:08.549271 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7e9c8cec97ba3d417e4bf74374b2fa06bcc59044e6c721e9db14784190d44b26\": container with ID starting with 7e9c8cec97ba3d417e4bf74374b2fa06bcc59044e6c721e9db14784190d44b26 not found: ID does not exist" containerID="7e9c8cec97ba3d417e4bf74374b2fa06bcc59044e6c721e9db14784190d44b26" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.549317 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9c8cec97ba3d417e4bf74374b2fa06bcc59044e6c721e9db14784190d44b26"} err="failed to get container status \"7e9c8cec97ba3d417e4bf74374b2fa06bcc59044e6c721e9db14784190d44b26\": rpc error: code = NotFound desc = could not find container \"7e9c8cec97ba3d417e4bf74374b2fa06bcc59044e6c721e9db14784190d44b26\": container with ID starting with 7e9c8cec97ba3d417e4bf74374b2fa06bcc59044e6c721e9db14784190d44b26 not found: ID does not exist" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.549348 4704 scope.go:117] "RemoveContainer" containerID="0bcd485d50f4f71d8baa6e77d2596ed0aea41601937538485905644798e31141" Nov 25 16:00:08 crc kubenswrapper[4704]: E1125 16:00:08.550101 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bcd485d50f4f71d8baa6e77d2596ed0aea41601937538485905644798e31141\": container with ID starting with 0bcd485d50f4f71d8baa6e77d2596ed0aea41601937538485905644798e31141 not found: ID does not exist" containerID="0bcd485d50f4f71d8baa6e77d2596ed0aea41601937538485905644798e31141" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.550125 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bcd485d50f4f71d8baa6e77d2596ed0aea41601937538485905644798e31141"} err="failed to get container status \"0bcd485d50f4f71d8baa6e77d2596ed0aea41601937538485905644798e31141\": rpc error: code = NotFound desc = could not find container \"0bcd485d50f4f71d8baa6e77d2596ed0aea41601937538485905644798e31141\": container with ID 
starting with 0bcd485d50f4f71d8baa6e77d2596ed0aea41601937538485905644798e31141 not found: ID does not exist" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.550138 4704 scope.go:117] "RemoveContainer" containerID="981484de944a4f30a774f10d144d9fd7102211b29a1acf133c1228752f950a24" Nov 25 16:00:08 crc kubenswrapper[4704]: E1125 16:00:08.550487 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"981484de944a4f30a774f10d144d9fd7102211b29a1acf133c1228752f950a24\": container with ID starting with 981484de944a4f30a774f10d144d9fd7102211b29a1acf133c1228752f950a24 not found: ID does not exist" containerID="981484de944a4f30a774f10d144d9fd7102211b29a1acf133c1228752f950a24" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.550535 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"981484de944a4f30a774f10d144d9fd7102211b29a1acf133c1228752f950a24"} err="failed to get container status \"981484de944a4f30a774f10d144d9fd7102211b29a1acf133c1228752f950a24\": rpc error: code = NotFound desc = could not find container \"981484de944a4f30a774f10d144d9fd7102211b29a1acf133c1228752f950a24\": container with ID starting with 981484de944a4f30a774f10d144d9fd7102211b29a1acf133c1228752f950a24 not found: ID does not exist" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.657012 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11c7083d-4abd-4a07-99e0-dd218f9a145b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11c7083d-4abd-4a07-99e0-dd218f9a145b" (UID: "11c7083d-4abd-4a07-99e0-dd218f9a145b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.714142 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11c7083d-4abd-4a07-99e0-dd218f9a145b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.788245 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-49h8n"] Nov 25 16:00:08 crc kubenswrapper[4704]: I1125 16:00:08.793897 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-49h8n"] Nov 25 16:00:09 crc kubenswrapper[4704]: I1125 16:00:09.472834 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerStarted","Data":"2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd"} Nov 25 16:00:09 crc kubenswrapper[4704]: I1125 16:00:09.476426 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7lp6r" event={"ID":"48a7441f-b4b6-44e1-9b00-a34e43c4986a","Type":"ContainerStarted","Data":"4f22076403dbd34367638c010e76593cc0c03dec06a1ca73e1ad5ff972a57fb2"} Nov 25 16:00:09 crc kubenswrapper[4704]: I1125 16:00:09.506543 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7lp6r" podStartSLOduration=1.831923132 podStartE2EDuration="4.506526101s" podCreationTimestamp="2025-11-25 16:00:05 +0000 UTC" firstStartedPulling="2025-11-25 16:00:06.416747277 +0000 UTC m=+1492.685021058" lastFinishedPulling="2025-11-25 16:00:09.091350256 +0000 UTC m=+1495.359624027" observedRunningTime="2025-11-25 16:00:09.503525974 +0000 UTC m=+1495.771799755" watchObservedRunningTime="2025-11-25 16:00:09.506526101 +0000 UTC m=+1495.774799882" Nov 25 16:00:10 crc 
kubenswrapper[4704]: I1125 16:00:10.423551 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11c7083d-4abd-4a07-99e0-dd218f9a145b" path="/var/lib/kubelet/pods/11c7083d-4abd-4a07-99e0-dd218f9a145b/volumes" Nov 25 16:00:12 crc kubenswrapper[4704]: I1125 16:00:12.894347 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qwsvr" Nov 25 16:00:12 crc kubenswrapper[4704]: I1125 16:00:12.896368 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qwsvr" Nov 25 16:00:12 crc kubenswrapper[4704]: I1125 16:00:12.948983 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qwsvr" Nov 25 16:00:13 crc kubenswrapper[4704]: I1125 16:00:13.550900 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qwsvr" Nov 25 16:00:14 crc kubenswrapper[4704]: I1125 16:00:14.335599 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qwsvr"] Nov 25 16:00:15 crc kubenswrapper[4704]: I1125 16:00:15.493689 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:15 crc kubenswrapper[4704]: I1125 16:00:15.495018 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:15 crc kubenswrapper[4704]: I1125 16:00:15.522019 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qwsvr" podUID="b0b38583-f22a-45dc-b1f6-1fb67f046f2d" containerName="registry-server" containerID="cri-o://33415a4e661564fbfa3edf2d1b56710c8685512bc8e593aa388b0867343e0ccf" gracePeriod=2 Nov 25 16:00:15 crc kubenswrapper[4704]: I1125 16:00:15.547203 4704 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.409107 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qwsvr" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.531817 4704 generic.go:334] "Generic (PLEG): container finished" podID="b0b38583-f22a-45dc-b1f6-1fb67f046f2d" containerID="33415a4e661564fbfa3edf2d1b56710c8685512bc8e593aa388b0867343e0ccf" exitCode=0 Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.531852 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qwsvr" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.531886 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qwsvr" event={"ID":"b0b38583-f22a-45dc-b1f6-1fb67f046f2d","Type":"ContainerDied","Data":"33415a4e661564fbfa3edf2d1b56710c8685512bc8e593aa388b0867343e0ccf"} Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.531913 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qwsvr" event={"ID":"b0b38583-f22a-45dc-b1f6-1fb67f046f2d","Type":"ContainerDied","Data":"3db84b2b8c81e685d646935003fa60dfbd268ad5822e97cb7c2b70c55c8b9d4d"} Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.531931 4704 scope.go:117] "RemoveContainer" containerID="33415a4e661564fbfa3edf2d1b56710c8685512bc8e593aa388b0867343e0ccf" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.535005 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-catalog-content\") pod \"b0b38583-f22a-45dc-b1f6-1fb67f046f2d\" (UID: \"b0b38583-f22a-45dc-b1f6-1fb67f046f2d\") " Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.535251 4704 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntxv8\" (UniqueName: \"kubernetes.io/projected/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-kube-api-access-ntxv8\") pod \"b0b38583-f22a-45dc-b1f6-1fb67f046f2d\" (UID: \"b0b38583-f22a-45dc-b1f6-1fb67f046f2d\") " Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.535401 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-utilities\") pod \"b0b38583-f22a-45dc-b1f6-1fb67f046f2d\" (UID: \"b0b38583-f22a-45dc-b1f6-1fb67f046f2d\") " Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.536289 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-utilities" (OuterVolumeSpecName: "utilities") pod "b0b38583-f22a-45dc-b1f6-1fb67f046f2d" (UID: "b0b38583-f22a-45dc-b1f6-1fb67f046f2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.547166 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-kube-api-access-ntxv8" (OuterVolumeSpecName: "kube-api-access-ntxv8") pod "b0b38583-f22a-45dc-b1f6-1fb67f046f2d" (UID: "b0b38583-f22a-45dc-b1f6-1fb67f046f2d"). InnerVolumeSpecName "kube-api-access-ntxv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.552574 4704 scope.go:117] "RemoveContainer" containerID="cfbea08ee5a4fcac5c97dd9db771327534d105b5563f120be6c08b22d8860031" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.555091 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0b38583-f22a-45dc-b1f6-1fb67f046f2d" (UID: "b0b38583-f22a-45dc-b1f6-1fb67f046f2d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.587272 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.593023 4704 scope.go:117] "RemoveContainer" containerID="13679ea0bdf964fb13f2819ad6203833b1f079e22c3c3567c3c504c7ee7eb226" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.616317 4704 scope.go:117] "RemoveContainer" containerID="33415a4e661564fbfa3edf2d1b56710c8685512bc8e593aa388b0867343e0ccf" Nov 25 16:00:16 crc kubenswrapper[4704]: E1125 16:00:16.616941 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33415a4e661564fbfa3edf2d1b56710c8685512bc8e593aa388b0867343e0ccf\": container with ID starting with 33415a4e661564fbfa3edf2d1b56710c8685512bc8e593aa388b0867343e0ccf not found: ID does not exist" containerID="33415a4e661564fbfa3edf2d1b56710c8685512bc8e593aa388b0867343e0ccf" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.616993 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33415a4e661564fbfa3edf2d1b56710c8685512bc8e593aa388b0867343e0ccf"} err="failed to get container status 
\"33415a4e661564fbfa3edf2d1b56710c8685512bc8e593aa388b0867343e0ccf\": rpc error: code = NotFound desc = could not find container \"33415a4e661564fbfa3edf2d1b56710c8685512bc8e593aa388b0867343e0ccf\": container with ID starting with 33415a4e661564fbfa3edf2d1b56710c8685512bc8e593aa388b0867343e0ccf not found: ID does not exist" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.617023 4704 scope.go:117] "RemoveContainer" containerID="cfbea08ee5a4fcac5c97dd9db771327534d105b5563f120be6c08b22d8860031" Nov 25 16:00:16 crc kubenswrapper[4704]: E1125 16:00:16.617510 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfbea08ee5a4fcac5c97dd9db771327534d105b5563f120be6c08b22d8860031\": container with ID starting with cfbea08ee5a4fcac5c97dd9db771327534d105b5563f120be6c08b22d8860031 not found: ID does not exist" containerID="cfbea08ee5a4fcac5c97dd9db771327534d105b5563f120be6c08b22d8860031" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.617531 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfbea08ee5a4fcac5c97dd9db771327534d105b5563f120be6c08b22d8860031"} err="failed to get container status \"cfbea08ee5a4fcac5c97dd9db771327534d105b5563f120be6c08b22d8860031\": rpc error: code = NotFound desc = could not find container \"cfbea08ee5a4fcac5c97dd9db771327534d105b5563f120be6c08b22d8860031\": container with ID starting with cfbea08ee5a4fcac5c97dd9db771327534d105b5563f120be6c08b22d8860031 not found: ID does not exist" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.617542 4704 scope.go:117] "RemoveContainer" containerID="13679ea0bdf964fb13f2819ad6203833b1f079e22c3c3567c3c504c7ee7eb226" Nov 25 16:00:16 crc kubenswrapper[4704]: E1125 16:00:16.617882 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"13679ea0bdf964fb13f2819ad6203833b1f079e22c3c3567c3c504c7ee7eb226\": container with ID starting with 13679ea0bdf964fb13f2819ad6203833b1f079e22c3c3567c3c504c7ee7eb226 not found: ID does not exist" containerID="13679ea0bdf964fb13f2819ad6203833b1f079e22c3c3567c3c504c7ee7eb226" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.617900 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13679ea0bdf964fb13f2819ad6203833b1f079e22c3c3567c3c504c7ee7eb226"} err="failed to get container status \"13679ea0bdf964fb13f2819ad6203833b1f079e22c3c3567c3c504c7ee7eb226\": rpc error: code = NotFound desc = could not find container \"13679ea0bdf964fb13f2819ad6203833b1f079e22c3c3567c3c504c7ee7eb226\": container with ID starting with 13679ea0bdf964fb13f2819ad6203833b1f079e22c3c3567c3c504c7ee7eb226 not found: ID does not exist" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.636978 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntxv8\" (UniqueName: \"kubernetes.io/projected/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-kube-api-access-ntxv8\") on node \"crc\" DevicePath \"\"" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.637024 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.637035 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0b38583-f22a-45dc-b1f6-1fb67f046f2d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.841364 4704 scope.go:117] "RemoveContainer" containerID="d9fa378769c2cf156dab36ce579906fa433f83b060ba3c1ebc27a5daba6cfdf8" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.866249 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-qwsvr"] Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.869306 4704 scope.go:117] "RemoveContainer" containerID="09142a0aee28f2c84807a42113649ee1fcd77643722796f0dcd1c61f3772e057" Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.875077 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qwsvr"] Nov 25 16:00:16 crc kubenswrapper[4704]: I1125 16:00:16.981447 4704 scope.go:117] "RemoveContainer" containerID="edde44852ee9a22b0b119321083e622f279f11ec499478e232c9c78c0c5b4909" Nov 25 16:00:18 crc kubenswrapper[4704]: I1125 16:00:18.426136 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0b38583-f22a-45dc-b1f6-1fb67f046f2d" path="/var/lib/kubelet/pods/b0b38583-f22a-45dc-b1f6-1fb67f046f2d/volumes" Nov 25 16:00:18 crc kubenswrapper[4704]: I1125 16:00:18.936090 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7lp6r"] Nov 25 16:00:18 crc kubenswrapper[4704]: I1125 16:00:18.936337 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7lp6r" podUID="48a7441f-b4b6-44e1-9b00-a34e43c4986a" containerName="registry-server" containerID="cri-o://4f22076403dbd34367638c010e76593cc0c03dec06a1ca73e1ad5ff972a57fb2" gracePeriod=2 Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.368206 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.476034 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nht5d\" (UniqueName: \"kubernetes.io/projected/48a7441f-b4b6-44e1-9b00-a34e43c4986a-kube-api-access-nht5d\") pod \"48a7441f-b4b6-44e1-9b00-a34e43c4986a\" (UID: \"48a7441f-b4b6-44e1-9b00-a34e43c4986a\") " Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.476933 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48a7441f-b4b6-44e1-9b00-a34e43c4986a-utilities\") pod \"48a7441f-b4b6-44e1-9b00-a34e43c4986a\" (UID: \"48a7441f-b4b6-44e1-9b00-a34e43c4986a\") " Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.476993 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48a7441f-b4b6-44e1-9b00-a34e43c4986a-catalog-content\") pod \"48a7441f-b4b6-44e1-9b00-a34e43c4986a\" (UID: \"48a7441f-b4b6-44e1-9b00-a34e43c4986a\") " Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.479112 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48a7441f-b4b6-44e1-9b00-a34e43c4986a-utilities" (OuterVolumeSpecName: "utilities") pod "48a7441f-b4b6-44e1-9b00-a34e43c4986a" (UID: "48a7441f-b4b6-44e1-9b00-a34e43c4986a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.483067 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48a7441f-b4b6-44e1-9b00-a34e43c4986a-kube-api-access-nht5d" (OuterVolumeSpecName: "kube-api-access-nht5d") pod "48a7441f-b4b6-44e1-9b00-a34e43c4986a" (UID: "48a7441f-b4b6-44e1-9b00-a34e43c4986a"). InnerVolumeSpecName "kube-api-access-nht5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.555584 4704 generic.go:334] "Generic (PLEG): container finished" podID="48a7441f-b4b6-44e1-9b00-a34e43c4986a" containerID="4f22076403dbd34367638c010e76593cc0c03dec06a1ca73e1ad5ff972a57fb2" exitCode=0 Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.555637 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7lp6r" event={"ID":"48a7441f-b4b6-44e1-9b00-a34e43c4986a","Type":"ContainerDied","Data":"4f22076403dbd34367638c010e76593cc0c03dec06a1ca73e1ad5ff972a57fb2"} Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.555671 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7lp6r" event={"ID":"48a7441f-b4b6-44e1-9b00-a34e43c4986a","Type":"ContainerDied","Data":"9cb2f96f148c566ca6a9a768a697b9e32814d5b31e7833944572c26a234ea79d"} Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.555692 4704 scope.go:117] "RemoveContainer" containerID="4f22076403dbd34367638c010e76593cc0c03dec06a1ca73e1ad5ff972a57fb2" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.555810 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7lp6r" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.565850 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48a7441f-b4b6-44e1-9b00-a34e43c4986a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48a7441f-b4b6-44e1-9b00-a34e43c4986a" (UID: "48a7441f-b4b6-44e1-9b00-a34e43c4986a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.583374 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nht5d\" (UniqueName: \"kubernetes.io/projected/48a7441f-b4b6-44e1-9b00-a34e43c4986a-kube-api-access-nht5d\") on node \"crc\" DevicePath \"\"" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.583432 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48a7441f-b4b6-44e1-9b00-a34e43c4986a-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.583443 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48a7441f-b4b6-44e1-9b00-a34e43c4986a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.589732 4704 scope.go:117] "RemoveContainer" containerID="4a570ec826ba85026e771eadfef16707e94c410674a741c29a3ae4643ea6100f" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.609598 4704 scope.go:117] "RemoveContainer" containerID="0b8ea1bd2deb78cdf1513c028db3249b501a737bd42302aff0d591ac8011663c" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.638295 4704 scope.go:117] "RemoveContainer" containerID="4f22076403dbd34367638c010e76593cc0c03dec06a1ca73e1ad5ff972a57fb2" Nov 25 16:00:19 crc kubenswrapper[4704]: E1125 16:00:19.638839 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f22076403dbd34367638c010e76593cc0c03dec06a1ca73e1ad5ff972a57fb2\": container with ID starting with 4f22076403dbd34367638c010e76593cc0c03dec06a1ca73e1ad5ff972a57fb2 not found: ID does not exist" containerID="4f22076403dbd34367638c010e76593cc0c03dec06a1ca73e1ad5ff972a57fb2" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.638903 4704 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4f22076403dbd34367638c010e76593cc0c03dec06a1ca73e1ad5ff972a57fb2"} err="failed to get container status \"4f22076403dbd34367638c010e76593cc0c03dec06a1ca73e1ad5ff972a57fb2\": rpc error: code = NotFound desc = could not find container \"4f22076403dbd34367638c010e76593cc0c03dec06a1ca73e1ad5ff972a57fb2\": container with ID starting with 4f22076403dbd34367638c010e76593cc0c03dec06a1ca73e1ad5ff972a57fb2 not found: ID does not exist" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.638930 4704 scope.go:117] "RemoveContainer" containerID="4a570ec826ba85026e771eadfef16707e94c410674a741c29a3ae4643ea6100f" Nov 25 16:00:19 crc kubenswrapper[4704]: E1125 16:00:19.639427 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a570ec826ba85026e771eadfef16707e94c410674a741c29a3ae4643ea6100f\": container with ID starting with 4a570ec826ba85026e771eadfef16707e94c410674a741c29a3ae4643ea6100f not found: ID does not exist" containerID="4a570ec826ba85026e771eadfef16707e94c410674a741c29a3ae4643ea6100f" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.639477 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a570ec826ba85026e771eadfef16707e94c410674a741c29a3ae4643ea6100f"} err="failed to get container status \"4a570ec826ba85026e771eadfef16707e94c410674a741c29a3ae4643ea6100f\": rpc error: code = NotFound desc = could not find container \"4a570ec826ba85026e771eadfef16707e94c410674a741c29a3ae4643ea6100f\": container with ID starting with 4a570ec826ba85026e771eadfef16707e94c410674a741c29a3ae4643ea6100f not found: ID does not exist" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.639506 4704 scope.go:117] "RemoveContainer" containerID="0b8ea1bd2deb78cdf1513c028db3249b501a737bd42302aff0d591ac8011663c" Nov 25 16:00:19 crc kubenswrapper[4704]: E1125 16:00:19.639824 4704 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0b8ea1bd2deb78cdf1513c028db3249b501a737bd42302aff0d591ac8011663c\": container with ID starting with 0b8ea1bd2deb78cdf1513c028db3249b501a737bd42302aff0d591ac8011663c not found: ID does not exist" containerID="0b8ea1bd2deb78cdf1513c028db3249b501a737bd42302aff0d591ac8011663c" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.639853 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8ea1bd2deb78cdf1513c028db3249b501a737bd42302aff0d591ac8011663c"} err="failed to get container status \"0b8ea1bd2deb78cdf1513c028db3249b501a737bd42302aff0d591ac8011663c\": rpc error: code = NotFound desc = could not find container \"0b8ea1bd2deb78cdf1513c028db3249b501a737bd42302aff0d591ac8011663c\": container with ID starting with 0b8ea1bd2deb78cdf1513c028db3249b501a737bd42302aff0d591ac8011663c not found: ID does not exist" Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.901331 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7lp6r"] Nov 25 16:00:19 crc kubenswrapper[4704]: I1125 16:00:19.906989 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7lp6r"] Nov 25 16:00:20 crc kubenswrapper[4704]: I1125 16:00:20.425891 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48a7441f-b4b6-44e1-9b00-a34e43c4986a" path="/var/lib/kubelet/pods/48a7441f-b4b6-44e1-9b00-a34e43c4986a/volumes" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.144705 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-cron-29401441-64x6t"] Nov 25 16:01:00 crc kubenswrapper[4704]: E1125 16:01:00.145905 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c7083d-4abd-4a07-99e0-dd218f9a145b" containerName="extract-content" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.145924 4704 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="11c7083d-4abd-4a07-99e0-dd218f9a145b" containerName="extract-content" Nov 25 16:01:00 crc kubenswrapper[4704]: E1125 16:01:00.145942 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b38583-f22a-45dc-b1f6-1fb67f046f2d" containerName="extract-content" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.145950 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b38583-f22a-45dc-b1f6-1fb67f046f2d" containerName="extract-content" Nov 25 16:01:00 crc kubenswrapper[4704]: E1125 16:01:00.145969 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b38583-f22a-45dc-b1f6-1fb67f046f2d" containerName="extract-utilities" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.145977 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b38583-f22a-45dc-b1f6-1fb67f046f2d" containerName="extract-utilities" Nov 25 16:01:00 crc kubenswrapper[4704]: E1125 16:01:00.145988 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a7441f-b4b6-44e1-9b00-a34e43c4986a" containerName="extract-content" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.145995 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a7441f-b4b6-44e1-9b00-a34e43c4986a" containerName="extract-content" Nov 25 16:01:00 crc kubenswrapper[4704]: E1125 16:01:00.146027 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a7441f-b4b6-44e1-9b00-a34e43c4986a" containerName="extract-utilities" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.146035 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a7441f-b4b6-44e1-9b00-a34e43c4986a" containerName="extract-utilities" Nov 25 16:01:00 crc kubenswrapper[4704]: E1125 16:01:00.146049 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a7441f-b4b6-44e1-9b00-a34e43c4986a" containerName="registry-server" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.146056 4704 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="48a7441f-b4b6-44e1-9b00-a34e43c4986a" containerName="registry-server" Nov 25 16:01:00 crc kubenswrapper[4704]: E1125 16:01:00.146070 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c7083d-4abd-4a07-99e0-dd218f9a145b" containerName="extract-utilities" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.146077 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c7083d-4abd-4a07-99e0-dd218f9a145b" containerName="extract-utilities" Nov 25 16:01:00 crc kubenswrapper[4704]: E1125 16:01:00.146087 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b38583-f22a-45dc-b1f6-1fb67f046f2d" containerName="registry-server" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.146096 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b38583-f22a-45dc-b1f6-1fb67f046f2d" containerName="registry-server" Nov 25 16:01:00 crc kubenswrapper[4704]: E1125 16:01:00.146109 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c7083d-4abd-4a07-99e0-dd218f9a145b" containerName="registry-server" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.146116 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c7083d-4abd-4a07-99e0-dd218f9a145b" containerName="registry-server" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.146264 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c7083d-4abd-4a07-99e0-dd218f9a145b" containerName="registry-server" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.146290 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b38583-f22a-45dc-b1f6-1fb67f046f2d" containerName="registry-server" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.146300 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="48a7441f-b4b6-44e1-9b00-a34e43c4986a" containerName="registry-server" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.146984 4704 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.153071 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-cron-29401441-64x6t"] Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.272745 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9ntt\" (UniqueName: \"kubernetes.io/projected/021a02df-e360-4c89-8f31-a891a0b3286e-kube-api-access-h9ntt\") pod \"keystone-cron-29401441-64x6t\" (UID: \"021a02df-e360-4c89-8f31-a891a0b3286e\") " pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.272876 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/021a02df-e360-4c89-8f31-a891a0b3286e-fernet-keys\") pod \"keystone-cron-29401441-64x6t\" (UID: \"021a02df-e360-4c89-8f31-a891a0b3286e\") " pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.272914 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/021a02df-e360-4c89-8f31-a891a0b3286e-config-data\") pod \"keystone-cron-29401441-64x6t\" (UID: \"021a02df-e360-4c89-8f31-a891a0b3286e\") " pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.374934 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/021a02df-e360-4c89-8f31-a891a0b3286e-config-data\") pod \"keystone-cron-29401441-64x6t\" (UID: \"021a02df-e360-4c89-8f31-a891a0b3286e\") " pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.375057 4704 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9ntt\" (UniqueName: \"kubernetes.io/projected/021a02df-e360-4c89-8f31-a891a0b3286e-kube-api-access-h9ntt\") pod \"keystone-cron-29401441-64x6t\" (UID: \"021a02df-e360-4c89-8f31-a891a0b3286e\") " pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.375099 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/021a02df-e360-4c89-8f31-a891a0b3286e-fernet-keys\") pod \"keystone-cron-29401441-64x6t\" (UID: \"021a02df-e360-4c89-8f31-a891a0b3286e\") " pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.384014 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/021a02df-e360-4c89-8f31-a891a0b3286e-config-data\") pod \"keystone-cron-29401441-64x6t\" (UID: \"021a02df-e360-4c89-8f31-a891a0b3286e\") " pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.391974 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/021a02df-e360-4c89-8f31-a891a0b3286e-fernet-keys\") pod \"keystone-cron-29401441-64x6t\" (UID: \"021a02df-e360-4c89-8f31-a891a0b3286e\") " pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.392569 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9ntt\" (UniqueName: \"kubernetes.io/projected/021a02df-e360-4c89-8f31-a891a0b3286e-kube-api-access-h9ntt\") pod \"keystone-cron-29401441-64x6t\" (UID: \"021a02df-e360-4c89-8f31-a891a0b3286e\") " pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.467419 4704 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" Nov 25 16:01:00 crc kubenswrapper[4704]: I1125 16:01:00.911712 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-cron-29401441-64x6t"] Nov 25 16:01:01 crc kubenswrapper[4704]: I1125 16:01:01.844595 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" event={"ID":"021a02df-e360-4c89-8f31-a891a0b3286e","Type":"ContainerStarted","Data":"b0c5dcca88580ca7a9834fcb3daabb0831dc8675b80f7a2d7961aa5bf39e9621"} Nov 25 16:01:01 crc kubenswrapper[4704]: I1125 16:01:01.845110 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" event={"ID":"021a02df-e360-4c89-8f31-a891a0b3286e","Type":"ContainerStarted","Data":"0c365bb19c785cea75ad768c62fc37f9448913dd9388f6286114af6eda65e30b"} Nov 25 16:01:01 crc kubenswrapper[4704]: I1125 16:01:01.865027 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" podStartSLOduration=1.864997179 podStartE2EDuration="1.864997179s" podCreationTimestamp="2025-11-25 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:01:01.861783066 +0000 UTC m=+1548.130056847" watchObservedRunningTime="2025-11-25 16:01:01.864997179 +0000 UTC m=+1548.133270970" Nov 25 16:01:03 crc kubenswrapper[4704]: I1125 16:01:03.879874 4704 generic.go:334] "Generic (PLEG): container finished" podID="021a02df-e360-4c89-8f31-a891a0b3286e" containerID="b0c5dcca88580ca7a9834fcb3daabb0831dc8675b80f7a2d7961aa5bf39e9621" exitCode=0 Nov 25 16:01:03 crc kubenswrapper[4704]: I1125 16:01:03.879922 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" 
event={"ID":"021a02df-e360-4c89-8f31-a891a0b3286e","Type":"ContainerDied","Data":"b0c5dcca88580ca7a9834fcb3daabb0831dc8675b80f7a2d7961aa5bf39e9621"} Nov 25 16:01:05 crc kubenswrapper[4704]: I1125 16:01:05.154692 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" Nov 25 16:01:05 crc kubenswrapper[4704]: I1125 16:01:05.250518 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/021a02df-e360-4c89-8f31-a891a0b3286e-config-data\") pod \"021a02df-e360-4c89-8f31-a891a0b3286e\" (UID: \"021a02df-e360-4c89-8f31-a891a0b3286e\") " Nov 25 16:01:05 crc kubenswrapper[4704]: I1125 16:01:05.250685 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/021a02df-e360-4c89-8f31-a891a0b3286e-fernet-keys\") pod \"021a02df-e360-4c89-8f31-a891a0b3286e\" (UID: \"021a02df-e360-4c89-8f31-a891a0b3286e\") " Nov 25 16:01:05 crc kubenswrapper[4704]: I1125 16:01:05.250775 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9ntt\" (UniqueName: \"kubernetes.io/projected/021a02df-e360-4c89-8f31-a891a0b3286e-kube-api-access-h9ntt\") pod \"021a02df-e360-4c89-8f31-a891a0b3286e\" (UID: \"021a02df-e360-4c89-8f31-a891a0b3286e\") " Nov 25 16:01:05 crc kubenswrapper[4704]: I1125 16:01:05.261169 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/021a02df-e360-4c89-8f31-a891a0b3286e-kube-api-access-h9ntt" (OuterVolumeSpecName: "kube-api-access-h9ntt") pod "021a02df-e360-4c89-8f31-a891a0b3286e" (UID: "021a02df-e360-4c89-8f31-a891a0b3286e"). InnerVolumeSpecName "kube-api-access-h9ntt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:01:05 crc kubenswrapper[4704]: I1125 16:01:05.261223 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/021a02df-e360-4c89-8f31-a891a0b3286e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "021a02df-e360-4c89-8f31-a891a0b3286e" (UID: "021a02df-e360-4c89-8f31-a891a0b3286e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:01:05 crc kubenswrapper[4704]: I1125 16:01:05.292399 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/021a02df-e360-4c89-8f31-a891a0b3286e-config-data" (OuterVolumeSpecName: "config-data") pod "021a02df-e360-4c89-8f31-a891a0b3286e" (UID: "021a02df-e360-4c89-8f31-a891a0b3286e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:01:05 crc kubenswrapper[4704]: I1125 16:01:05.353213 4704 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/021a02df-e360-4c89-8f31-a891a0b3286e-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 25 16:01:05 crc kubenswrapper[4704]: I1125 16:01:05.353260 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9ntt\" (UniqueName: \"kubernetes.io/projected/021a02df-e360-4c89-8f31-a891a0b3286e-kube-api-access-h9ntt\") on node \"crc\" DevicePath \"\"" Nov 25 16:01:05 crc kubenswrapper[4704]: I1125 16:01:05.353275 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/021a02df-e360-4c89-8f31-a891a0b3286e-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:01:05 crc kubenswrapper[4704]: I1125 16:01:05.897297 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" 
event={"ID":"021a02df-e360-4c89-8f31-a891a0b3286e","Type":"ContainerDied","Data":"0c365bb19c785cea75ad768c62fc37f9448913dd9388f6286114af6eda65e30b"} Nov 25 16:01:05 crc kubenswrapper[4704]: I1125 16:01:05.897350 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c365bb19c785cea75ad768c62fc37f9448913dd9388f6286114af6eda65e30b" Nov 25 16:01:05 crc kubenswrapper[4704]: I1125 16:01:05.897368 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-cron-29401441-64x6t" Nov 25 16:01:42 crc kubenswrapper[4704]: I1125 16:01:42.049779 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-24cf-account-create-update-2429z"] Nov 25 16:01:42 crc kubenswrapper[4704]: I1125 16:01:42.055616 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-create-vkffq"] Nov 25 16:01:42 crc kubenswrapper[4704]: I1125 16:01:42.061127 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-24cf-account-create-update-2429z"] Nov 25 16:01:42 crc kubenswrapper[4704]: I1125 16:01:42.066019 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-create-vkffq"] Nov 25 16:01:42 crc kubenswrapper[4704]: I1125 16:01:42.424913 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01b4516d-6cb3-47f0-8343-62f39b0d9e52" path="/var/lib/kubelet/pods/01b4516d-6cb3-47f0-8343-62f39b0d9e52/volumes" Nov 25 16:01:42 crc kubenswrapper[4704]: I1125 16:01:42.425806 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c68eff74-0c7f-4150-b600-ffc3293b4e4d" path="/var/lib/kubelet/pods/c68eff74-0c7f-4150-b600-ffc3293b4e4d/volumes" Nov 25 16:02:00 crc kubenswrapper[4704]: I1125 16:02:00.025152 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-k6cq8"] Nov 25 16:02:00 crc kubenswrapper[4704]: I1125 
16:02:00.031599 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-k6cq8"] Nov 25 16:02:00 crc kubenswrapper[4704]: I1125 16:02:00.425297 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4240ea8b-01f6-4a52-99e8-f985830dacd9" path="/var/lib/kubelet/pods/4240ea8b-01f6-4a52-99e8-f985830dacd9/volumes" Nov 25 16:02:06 crc kubenswrapper[4704]: I1125 16:02:06.035976 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-ftgpq"] Nov 25 16:02:06 crc kubenswrapper[4704]: I1125 16:02:06.044432 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-ftgpq"] Nov 25 16:02:06 crc kubenswrapper[4704]: I1125 16:02:06.424358 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95561520-558e-4407-9c6c-f76abdf194a7" path="/var/lib/kubelet/pods/95561520-558e-4407-9c6c-f76abdf194a7/volumes" Nov 25 16:02:17 crc kubenswrapper[4704]: I1125 16:02:17.123445 4704 scope.go:117] "RemoveContainer" containerID="2b7df9b4018e475cfbd3a363c2ba12b1a4c5b4b33bd230ea9a0755f18b3ed8a9" Nov 25 16:02:17 crc kubenswrapper[4704]: I1125 16:02:17.178771 4704 scope.go:117] "RemoveContainer" containerID="1fac7abea0b1b3522f33fe7c95ad238db03e6bfb8543443aa0a11d2474e6fe7d" Nov 25 16:02:17 crc kubenswrapper[4704]: I1125 16:02:17.203097 4704 scope.go:117] "RemoveContainer" containerID="cdef7dd544dbfdb0345ab8b5776a2039078d0ed4d4b4b516b7e0e58f6ad92a4f" Nov 25 16:02:17 crc kubenswrapper[4704]: I1125 16:02:17.236692 4704 scope.go:117] "RemoveContainer" containerID="67f51b18809bc1bca8a1c59046482cad9b26e385c77db7991f668f447d5972dc" Nov 25 16:02:17 crc kubenswrapper[4704]: I1125 16:02:17.269614 4704 scope.go:117] "RemoveContainer" containerID="76cbce8b6db02f840316a957d781a02904a4ed2d9f43981b170ec21c24f5569c" Nov 25 16:02:17 crc kubenswrapper[4704]: I1125 16:02:17.297430 4704 scope.go:117] "RemoveContainer" 
containerID="6694d415abf5b9c4894cc8d83ec1ee7122e2fe70c2d9082ac3a9d8f9702e89fc" Nov 25 16:02:17 crc kubenswrapper[4704]: I1125 16:02:17.343649 4704 scope.go:117] "RemoveContainer" containerID="9344d7f21129ef3164497f4db662d15a6a375a283fc05ff61eb73a31ceda615c" Nov 25 16:02:17 crc kubenswrapper[4704]: I1125 16:02:17.376681 4704 scope.go:117] "RemoveContainer" containerID="3f5f3abd3f3cba5b8fcd0b38d73e7ffa3cd584fe2cf8f637aba9b1e7b8b02eed" Nov 25 16:02:17 crc kubenswrapper[4704]: I1125 16:02:17.393847 4704 scope.go:117] "RemoveContainer" containerID="9405a28c037889ca82a7c5ce253c1599d92264f9345da36c713a826cde0efbfc" Nov 25 16:02:37 crc kubenswrapper[4704]: I1125 16:02:37.963978 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:02:37 crc kubenswrapper[4704]: I1125 16:02:37.964857 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.117216 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 25 16:02:39 crc kubenswrapper[4704]: E1125 16:02:39.118011 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="021a02df-e360-4c89-8f31-a891a0b3286e" containerName="keystone-cron" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.118027 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="021a02df-e360-4c89-8f31-a891a0b3286e" containerName="keystone-cron" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.118218 4704 
memory_manager.go:354] "RemoveStaleState removing state" podUID="021a02df-e360-4c89-8f31-a891a0b3286e" containerName="keystone-cron" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.119250 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.133898 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.240093 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28aa6246-69d0-4ba2-b8d7-90972781c27d-logs\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.240180 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.240207 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.240232 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28aa6246-69d0-4ba2-b8d7-90972781c27d-config-data\") pod 
\"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.240291 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.240315 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28aa6246-69d0-4ba2-b8d7-90972781c27d-scripts\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.240344 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn56r\" (UniqueName: \"kubernetes.io/projected/28aa6246-69d0-4ba2-b8d7-90972781c27d-kube-api-access-tn56r\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.240398 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.240451 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-run\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.240498 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.240524 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-sys\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.240556 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-dev\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.240580 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.240614 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/28aa6246-69d0-4ba2-b8d7-90972781c27d-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.256529 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.257951 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.260161 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.260200 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-sp8pv" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.260363 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.260497 4704 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.262821 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.341534 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.341591 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/28aa6246-69d0-4ba2-b8d7-90972781c27d-scripts\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.341628 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn56r\" (UniqueName: \"kubernetes.io/projected/28aa6246-69d0-4ba2-b8d7-90972781c27d-kube-api-access-tn56r\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.341674 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.341698 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-run\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.341730 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.341747 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-sys\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.341768 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-dev\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.341803 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.341834 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/28aa6246-69d0-4ba2-b8d7-90972781c27d-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.341851 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28aa6246-69d0-4ba2-b8d7-90972781c27d-logs\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.341867 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.341881 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.341896 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28aa6246-69d0-4ba2-b8d7-90972781c27d-config-data\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.342192 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.342288 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-sys\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.342319 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-1\" (UID: 
\"28aa6246-69d0-4ba2-b8d7-90972781c27d\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.342825 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/28aa6246-69d0-4ba2-b8d7-90972781c27d-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.342338 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-dev\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.343135 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28aa6246-69d0-4ba2-b8d7-90972781c27d-logs\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.343197 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-run\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.343217 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.343207 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.343250 4704 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.343286 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.353643 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28aa6246-69d0-4ba2-b8d7-90972781c27d-scripts\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.362178 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28aa6246-69d0-4ba2-b8d7-90972781c27d-config-data\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.369733 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn56r\" (UniqueName: \"kubernetes.io/projected/28aa6246-69d0-4ba2-b8d7-90972781c27d-kube-api-access-tn56r\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.371550 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.378005 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.443130 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/22e6497e-244f-43ec-be2f-bb10b67a619e-openstack-scripts\") pod \"openstackclient\" (UID: \"22e6497e-244f-43ec-be2f-bb10b67a619e\") " pod="glance-kuttl-tests/openstackclient" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.443653 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/22e6497e-244f-43ec-be2f-bb10b67a619e-openstack-config-secret\") pod \"openstackclient\" (UID: \"22e6497e-244f-43ec-be2f-bb10b67a619e\") " pod="glance-kuttl-tests/openstackclient" 
Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.443718 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwfzg\" (UniqueName: \"kubernetes.io/projected/22e6497e-244f-43ec-be2f-bb10b67a619e-kube-api-access-qwfzg\") pod \"openstackclient\" (UID: \"22e6497e-244f-43ec-be2f-bb10b67a619e\") " pod="glance-kuttl-tests/openstackclient" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.443751 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/22e6497e-244f-43ec-be2f-bb10b67a619e-openstack-config\") pod \"openstackclient\" (UID: \"22e6497e-244f-43ec-be2f-bb10b67a619e\") " pod="glance-kuttl-tests/openstackclient" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.444150 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.545335 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/22e6497e-244f-43ec-be2f-bb10b67a619e-openstack-config\") pod \"openstackclient\" (UID: \"22e6497e-244f-43ec-be2f-bb10b67a619e\") " pod="glance-kuttl-tests/openstackclient" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.545457 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/22e6497e-244f-43ec-be2f-bb10b67a619e-openstack-scripts\") pod \"openstackclient\" (UID: \"22e6497e-244f-43ec-be2f-bb10b67a619e\") " pod="glance-kuttl-tests/openstackclient" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.545504 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/22e6497e-244f-43ec-be2f-bb10b67a619e-openstack-config-secret\") pod \"openstackclient\" (UID: \"22e6497e-244f-43ec-be2f-bb10b67a619e\") " pod="glance-kuttl-tests/openstackclient" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.545575 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwfzg\" (UniqueName: \"kubernetes.io/projected/22e6497e-244f-43ec-be2f-bb10b67a619e-kube-api-access-qwfzg\") pod \"openstackclient\" (UID: \"22e6497e-244f-43ec-be2f-bb10b67a619e\") " pod="glance-kuttl-tests/openstackclient" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.546696 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/22e6497e-244f-43ec-be2f-bb10b67a619e-openstack-config\") pod \"openstackclient\" (UID: \"22e6497e-244f-43ec-be2f-bb10b67a619e\") " pod="glance-kuttl-tests/openstackclient" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.549260 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/22e6497e-244f-43ec-be2f-bb10b67a619e-openstack-scripts\") pod \"openstackclient\" (UID: \"22e6497e-244f-43ec-be2f-bb10b67a619e\") " pod="glance-kuttl-tests/openstackclient" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.553185 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/22e6497e-244f-43ec-be2f-bb10b67a619e-openstack-config-secret\") pod \"openstackclient\" (UID: \"22e6497e-244f-43ec-be2f-bb10b67a619e\") " pod="glance-kuttl-tests/openstackclient" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.563356 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwfzg\" (UniqueName: \"kubernetes.io/projected/22e6497e-244f-43ec-be2f-bb10b67a619e-kube-api-access-qwfzg\") pod \"openstackclient\" (UID: 
\"22e6497e-244f-43ec-be2f-bb10b67a619e\") " pod="glance-kuttl-tests/openstackclient" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.579076 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Nov 25 16:02:39 crc kubenswrapper[4704]: I1125 16:02:39.887329 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 25 16:02:40 crc kubenswrapper[4704]: I1125 16:02:40.010583 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Nov 25 16:02:40 crc kubenswrapper[4704]: W1125 16:02:40.013226 4704 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22e6497e_244f_43ec_be2f_bb10b67a619e.slice/crio-a56690a614d0924eb02f751d7a7e20aab5391dc996a0d3d493c6fceb5488e6e5 WatchSource:0}: Error finding container a56690a614d0924eb02f751d7a7e20aab5391dc996a0d3d493c6fceb5488e6e5: Status 404 returned error can't find the container with id a56690a614d0924eb02f751d7a7e20aab5391dc996a0d3d493c6fceb5488e6e5 Nov 25 16:02:40 crc kubenswrapper[4704]: I1125 16:02:40.582916 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"22e6497e-244f-43ec-be2f-bb10b67a619e","Type":"ContainerStarted","Data":"f7397873b9c2e424d0efb104507e2edc489d56171164d811afbad367e2186c29"} Nov 25 16:02:40 crc kubenswrapper[4704]: I1125 16:02:40.583844 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"22e6497e-244f-43ec-be2f-bb10b67a619e","Type":"ContainerStarted","Data":"a56690a614d0924eb02f751d7a7e20aab5391dc996a0d3d493c6fceb5488e6e5"} Nov 25 16:02:40 crc kubenswrapper[4704]: I1125 16:02:40.590751 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" 
event={"ID":"28aa6246-69d0-4ba2-b8d7-90972781c27d","Type":"ContainerStarted","Data":"fcfc18c51846c2e7b5dc60b3aabc5eba6dbf8bfee2b4f13dcf2dfcea712213b5"} Nov 25 16:02:40 crc kubenswrapper[4704]: I1125 16:02:40.590824 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"28aa6246-69d0-4ba2-b8d7-90972781c27d","Type":"ContainerStarted","Data":"505fb008d82457ff1d5038dbc3240d3e6e2c9e9c9140f37bb386934b7e9a0d3e"} Nov 25 16:02:40 crc kubenswrapper[4704]: I1125 16:02:40.590835 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"28aa6246-69d0-4ba2-b8d7-90972781c27d","Type":"ContainerStarted","Data":"d94f7f2359b6db901cb864e5edbf470b794d920d271de232f7de85e47fa725ed"} Nov 25 16:02:40 crc kubenswrapper[4704]: I1125 16:02:40.590846 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"28aa6246-69d0-4ba2-b8d7-90972781c27d","Type":"ContainerStarted","Data":"7c7a9a294b8486c6540de557af9d75346c03a4635b0a769d1f77efaf9aa8bfda"} Nov 25 16:02:40 crc kubenswrapper[4704]: I1125 16:02:40.598849 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=1.598822367 podStartE2EDuration="1.598822367s" podCreationTimestamp="2025-11-25 16:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:02:40.595011077 +0000 UTC m=+1646.863284858" watchObservedRunningTime="2025-11-25 16:02:40.598822367 +0000 UTC m=+1646.867096158" Nov 25 16:02:49 crc kubenswrapper[4704]: I1125 16:02:49.444783 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:49 crc kubenswrapper[4704]: I1125 16:02:49.445800 4704 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:49 crc kubenswrapper[4704]: I1125 16:02:49.445817 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:49 crc kubenswrapper[4704]: I1125 16:02:49.472130 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:49 crc kubenswrapper[4704]: I1125 16:02:49.472378 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:49 crc kubenswrapper[4704]: I1125 16:02:49.487176 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:49 crc kubenswrapper[4704]: I1125 16:02:49.497996 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=10.497977841 podStartE2EDuration="10.497977841s" podCreationTimestamp="2025-11-25 16:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 16:02:40.623294125 +0000 UTC m=+1646.891567906" watchObservedRunningTime="2025-11-25 16:02:49.497977841 +0000 UTC m=+1655.766251622" Nov 25 16:02:49 crc kubenswrapper[4704]: I1125 16:02:49.654606 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:49 crc kubenswrapper[4704]: I1125 16:02:49.654662 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:49 crc kubenswrapper[4704]: I1125 16:02:49.654678 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:49 crc kubenswrapper[4704]: I1125 16:02:49.667292 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:49 crc kubenswrapper[4704]: I1125 16:02:49.668887 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:02:49 crc kubenswrapper[4704]: I1125 16:02:49.669134 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:03:07 crc kubenswrapper[4704]: I1125 16:03:07.964657 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:03:07 crc kubenswrapper[4704]: I1125 16:03:07.967425 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:03:37 crc kubenswrapper[4704]: I1125 16:03:37.965055 4704 patch_prober.go:28] interesting pod/machine-config-daemon-djz8x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 16:03:37 crc kubenswrapper[4704]: I1125 16:03:37.966030 4704 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 16:03:37 crc kubenswrapper[4704]: I1125 16:03:37.966092 4704 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" Nov 25 16:03:37 crc kubenswrapper[4704]: I1125 16:03:37.966850 4704 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd"} pod="openshift-machine-config-operator/machine-config-daemon-djz8x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 16:03:37 crc kubenswrapper[4704]: I1125 16:03:37.966914 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerName="machine-config-daemon" containerID="cri-o://2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" gracePeriod=600 Nov 25 16:03:38 crc kubenswrapper[4704]: E1125 16:03:38.090447 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:03:39 crc kubenswrapper[4704]: I1125 16:03:39.002881 4704 generic.go:334] "Generic (PLEG): container finished" podID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" exitCode=0 Nov 25 16:03:39 crc kubenswrapper[4704]: I1125 
16:03:39.003372 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerDied","Data":"2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd"} Nov 25 16:03:39 crc kubenswrapper[4704]: I1125 16:03:39.003412 4704 scope.go:117] "RemoveContainer" containerID="7a574ca126e29dffea7335dc4eb45068ba5d1201355a4dba5124c9c343f20921" Nov 25 16:03:39 crc kubenswrapper[4704]: I1125 16:03:39.004013 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:03:39 crc kubenswrapper[4704]: E1125 16:03:39.004270 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:03:47 crc kubenswrapper[4704]: I1125 16:03:47.569854 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 25 16:03:47 crc kubenswrapper[4704]: I1125 16:03:47.571098 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="28aa6246-69d0-4ba2-b8d7-90972781c27d" containerName="glance-log" containerID="cri-o://d94f7f2359b6db901cb864e5edbf470b794d920d271de232f7de85e47fa725ed" gracePeriod=30 Nov 25 16:03:47 crc kubenswrapper[4704]: I1125 16:03:47.571160 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="28aa6246-69d0-4ba2-b8d7-90972781c27d" containerName="glance-api" 
containerID="cri-o://fcfc18c51846c2e7b5dc60b3aabc5eba6dbf8bfee2b4f13dcf2dfcea712213b5" gracePeriod=30 Nov 25 16:03:47 crc kubenswrapper[4704]: I1125 16:03:47.571183 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="28aa6246-69d0-4ba2-b8d7-90972781c27d" containerName="glance-httpd" containerID="cri-o://505fb008d82457ff1d5038dbc3240d3e6e2c9e9c9140f37bb386934b7e9a0d3e" gracePeriod=30 Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.078039 4704 generic.go:334] "Generic (PLEG): container finished" podID="28aa6246-69d0-4ba2-b8d7-90972781c27d" containerID="505fb008d82457ff1d5038dbc3240d3e6e2c9e9c9140f37bb386934b7e9a0d3e" exitCode=0 Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.078079 4704 generic.go:334] "Generic (PLEG): container finished" podID="28aa6246-69d0-4ba2-b8d7-90972781c27d" containerID="d94f7f2359b6db901cb864e5edbf470b794d920d271de232f7de85e47fa725ed" exitCode=143 Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.078106 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"28aa6246-69d0-4ba2-b8d7-90972781c27d","Type":"ContainerDied","Data":"505fb008d82457ff1d5038dbc3240d3e6e2c9e9c9140f37bb386934b7e9a0d3e"} Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.078136 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"28aa6246-69d0-4ba2-b8d7-90972781c27d","Type":"ContainerDied","Data":"d94f7f2359b6db901cb864e5edbf470b794d920d271de232f7de85e47fa725ed"} Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.379031 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.499595 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-lib-modules\") pod \"28aa6246-69d0-4ba2-b8d7-90972781c27d\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.499673 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-etc-nvme\") pod \"28aa6246-69d0-4ba2-b8d7-90972781c27d\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.499729 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"28aa6246-69d0-4ba2-b8d7-90972781c27d\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.499758 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28aa6246-69d0-4ba2-b8d7-90972781c27d-config-data\") pod \"28aa6246-69d0-4ba2-b8d7-90972781c27d\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.499782 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-dev\") pod \"28aa6246-69d0-4ba2-b8d7-90972781c27d\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.499814 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-lib-modules" 
(OuterVolumeSpecName: "lib-modules") pod "28aa6246-69d0-4ba2-b8d7-90972781c27d" (UID: "28aa6246-69d0-4ba2-b8d7-90972781c27d"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.499838 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/28aa6246-69d0-4ba2-b8d7-90972781c27d-httpd-run\") pod \"28aa6246-69d0-4ba2-b8d7-90972781c27d\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.499915 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-run\") pod \"28aa6246-69d0-4ba2-b8d7-90972781c27d\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.499915 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-dev" (OuterVolumeSpecName: "dev") pod "28aa6246-69d0-4ba2-b8d7-90972781c27d" (UID: "28aa6246-69d0-4ba2-b8d7-90972781c27d"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.499959 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28aa6246-69d0-4ba2-b8d7-90972781c27d-scripts\") pod \"28aa6246-69d0-4ba2-b8d7-90972781c27d\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.499992 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-run" (OuterVolumeSpecName: "run") pod "28aa6246-69d0-4ba2-b8d7-90972781c27d" (UID: "28aa6246-69d0-4ba2-b8d7-90972781c27d"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500013 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28aa6246-69d0-4ba2-b8d7-90972781c27d-logs\") pod \"28aa6246-69d0-4ba2-b8d7-90972781c27d\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500025 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "28aa6246-69d0-4ba2-b8d7-90972781c27d" (UID: "28aa6246-69d0-4ba2-b8d7-90972781c27d"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500073 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"28aa6246-69d0-4ba2-b8d7-90972781c27d\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500105 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-var-locks-brick\") pod \"28aa6246-69d0-4ba2-b8d7-90972781c27d\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500130 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-sys\") pod \"28aa6246-69d0-4ba2-b8d7-90972781c27d\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500160 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn56r\" 
(UniqueName: \"kubernetes.io/projected/28aa6246-69d0-4ba2-b8d7-90972781c27d-kube-api-access-tn56r\") pod \"28aa6246-69d0-4ba2-b8d7-90972781c27d\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500189 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28aa6246-69d0-4ba2-b8d7-90972781c27d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "28aa6246-69d0-4ba2-b8d7-90972781c27d" (UID: "28aa6246-69d0-4ba2-b8d7-90972781c27d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500202 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-sys" (OuterVolumeSpecName: "sys") pod "28aa6246-69d0-4ba2-b8d7-90972781c27d" (UID: "28aa6246-69d0-4ba2-b8d7-90972781c27d"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500202 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "28aa6246-69d0-4ba2-b8d7-90972781c27d" (UID: "28aa6246-69d0-4ba2-b8d7-90972781c27d"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500221 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "28aa6246-69d0-4ba2-b8d7-90972781c27d" (UID: "28aa6246-69d0-4ba2-b8d7-90972781c27d"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500199 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-etc-iscsi\") pod \"28aa6246-69d0-4ba2-b8d7-90972781c27d\" (UID: \"28aa6246-69d0-4ba2-b8d7-90972781c27d\") " Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500548 4704 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500563 4704 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-dev\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500571 4704 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/28aa6246-69d0-4ba2-b8d7-90972781c27d-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500581 4704 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-run\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500589 4704 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500598 4704 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-sys\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500606 4704 
reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.500614 4704 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/28aa6246-69d0-4ba2-b8d7-90972781c27d-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.504117 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28aa6246-69d0-4ba2-b8d7-90972781c27d-logs" (OuterVolumeSpecName: "logs") pod "28aa6246-69d0-4ba2-b8d7-90972781c27d" (UID: "28aa6246-69d0-4ba2-b8d7-90972781c27d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.511931 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "28aa6246-69d0-4ba2-b8d7-90972781c27d" (UID: "28aa6246-69d0-4ba2-b8d7-90972781c27d"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.514078 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "28aa6246-69d0-4ba2-b8d7-90972781c27d" (UID: "28aa6246-69d0-4ba2-b8d7-90972781c27d"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.514224 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28aa6246-69d0-4ba2-b8d7-90972781c27d-scripts" (OuterVolumeSpecName: "scripts") pod "28aa6246-69d0-4ba2-b8d7-90972781c27d" (UID: "28aa6246-69d0-4ba2-b8d7-90972781c27d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.514569 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28aa6246-69d0-4ba2-b8d7-90972781c27d-kube-api-access-tn56r" (OuterVolumeSpecName: "kube-api-access-tn56r") pod "28aa6246-69d0-4ba2-b8d7-90972781c27d" (UID: "28aa6246-69d0-4ba2-b8d7-90972781c27d"). InnerVolumeSpecName "kube-api-access-tn56r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.580062 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28aa6246-69d0-4ba2-b8d7-90972781c27d-config-data" (OuterVolumeSpecName: "config-data") pod "28aa6246-69d0-4ba2-b8d7-90972781c27d" (UID: "28aa6246-69d0-4ba2-b8d7-90972781c27d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.602098 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.602135 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28aa6246-69d0-4ba2-b8d7-90972781c27d-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.602145 4704 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28aa6246-69d0-4ba2-b8d7-90972781c27d-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.602155 4704 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28aa6246-69d0-4ba2-b8d7-90972781c27d-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.602169 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.602178 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn56r\" (UniqueName: \"kubernetes.io/projected/28aa6246-69d0-4ba2-b8d7-90972781c27d-kube-api-access-tn56r\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.615612 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.615882 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.704701 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:48 crc kubenswrapper[4704]: I1125 16:03:48.704750 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.047272 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-mcw5g"] Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.053191 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-mcw5g"] Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.087720 4704 generic.go:334] "Generic (PLEG): container finished" podID="28aa6246-69d0-4ba2-b8d7-90972781c27d" containerID="fcfc18c51846c2e7b5dc60b3aabc5eba6dbf8bfee2b4f13dcf2dfcea712213b5" exitCode=0 Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.087773 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"28aa6246-69d0-4ba2-b8d7-90972781c27d","Type":"ContainerDied","Data":"fcfc18c51846c2e7b5dc60b3aabc5eba6dbf8bfee2b4f13dcf2dfcea712213b5"} Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.087829 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"28aa6246-69d0-4ba2-b8d7-90972781c27d","Type":"ContainerDied","Data":"7c7a9a294b8486c6540de557af9d75346c03a4635b0a769d1f77efaf9aa8bfda"} Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.087851 4704 scope.go:117] "RemoveContainer" containerID="fcfc18c51846c2e7b5dc60b3aabc5eba6dbf8bfee2b4f13dcf2dfcea712213b5" Nov 25 16:03:49 crc 
kubenswrapper[4704]: I1125 16:03:49.087884 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.088741 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance90ee-account-delete-4ft46"] Nov 25 16:03:49 crc kubenswrapper[4704]: E1125 16:03:49.089041 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28aa6246-69d0-4ba2-b8d7-90972781c27d" containerName="glance-httpd" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.089061 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="28aa6246-69d0-4ba2-b8d7-90972781c27d" containerName="glance-httpd" Nov 25 16:03:49 crc kubenswrapper[4704]: E1125 16:03:49.089079 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28aa6246-69d0-4ba2-b8d7-90972781c27d" containerName="glance-log" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.089087 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="28aa6246-69d0-4ba2-b8d7-90972781c27d" containerName="glance-log" Nov 25 16:03:49 crc kubenswrapper[4704]: E1125 16:03:49.089107 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28aa6246-69d0-4ba2-b8d7-90972781c27d" containerName="glance-api" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.089113 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="28aa6246-69d0-4ba2-b8d7-90972781c27d" containerName="glance-api" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.089221 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="28aa6246-69d0-4ba2-b8d7-90972781c27d" containerName="glance-api" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.089237 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="28aa6246-69d0-4ba2-b8d7-90972781c27d" containerName="glance-httpd" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.089257 4704 
memory_manager.go:354] "RemoveStaleState removing state" podUID="28aa6246-69d0-4ba2-b8d7-90972781c27d" containerName="glance-log" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.089710 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance90ee-account-delete-4ft46" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.110161 4704 scope.go:117] "RemoveContainer" containerID="505fb008d82457ff1d5038dbc3240d3e6e2c9e9c9140f37bb386934b7e9a0d3e" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.118984 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance90ee-account-delete-4ft46"] Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.131176 4704 scope.go:117] "RemoveContainer" containerID="d94f7f2359b6db901cb864e5edbf470b794d920d271de232f7de85e47fa725ed" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.141991 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.146847 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.148190 4704 scope.go:117] "RemoveContainer" containerID="fcfc18c51846c2e7b5dc60b3aabc5eba6dbf8bfee2b4f13dcf2dfcea712213b5" Nov 25 16:03:49 crc kubenswrapper[4704]: E1125 16:03:49.148706 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcfc18c51846c2e7b5dc60b3aabc5eba6dbf8bfee2b4f13dcf2dfcea712213b5\": container with ID starting with fcfc18c51846c2e7b5dc60b3aabc5eba6dbf8bfee2b4f13dcf2dfcea712213b5 not found: ID does not exist" containerID="fcfc18c51846c2e7b5dc60b3aabc5eba6dbf8bfee2b4f13dcf2dfcea712213b5" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.148816 4704 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fcfc18c51846c2e7b5dc60b3aabc5eba6dbf8bfee2b4f13dcf2dfcea712213b5"} err="failed to get container status \"fcfc18c51846c2e7b5dc60b3aabc5eba6dbf8bfee2b4f13dcf2dfcea712213b5\": rpc error: code = NotFound desc = could not find container \"fcfc18c51846c2e7b5dc60b3aabc5eba6dbf8bfee2b4f13dcf2dfcea712213b5\": container with ID starting with fcfc18c51846c2e7b5dc60b3aabc5eba6dbf8bfee2b4f13dcf2dfcea712213b5 not found: ID does not exist" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.148934 4704 scope.go:117] "RemoveContainer" containerID="505fb008d82457ff1d5038dbc3240d3e6e2c9e9c9140f37bb386934b7e9a0d3e" Nov 25 16:03:49 crc kubenswrapper[4704]: E1125 16:03:49.149359 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"505fb008d82457ff1d5038dbc3240d3e6e2c9e9c9140f37bb386934b7e9a0d3e\": container with ID starting with 505fb008d82457ff1d5038dbc3240d3e6e2c9e9c9140f37bb386934b7e9a0d3e not found: ID does not exist" containerID="505fb008d82457ff1d5038dbc3240d3e6e2c9e9c9140f37bb386934b7e9a0d3e" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.149456 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505fb008d82457ff1d5038dbc3240d3e6e2c9e9c9140f37bb386934b7e9a0d3e"} err="failed to get container status \"505fb008d82457ff1d5038dbc3240d3e6e2c9e9c9140f37bb386934b7e9a0d3e\": rpc error: code = NotFound desc = could not find container \"505fb008d82457ff1d5038dbc3240d3e6e2c9e9c9140f37bb386934b7e9a0d3e\": container with ID starting with 505fb008d82457ff1d5038dbc3240d3e6e2c9e9c9140f37bb386934b7e9a0d3e not found: ID does not exist" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.149531 4704 scope.go:117] "RemoveContainer" containerID="d94f7f2359b6db901cb864e5edbf470b794d920d271de232f7de85e47fa725ed" Nov 25 16:03:49 crc kubenswrapper[4704]: E1125 16:03:49.149856 4704 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d94f7f2359b6db901cb864e5edbf470b794d920d271de232f7de85e47fa725ed\": container with ID starting with d94f7f2359b6db901cb864e5edbf470b794d920d271de232f7de85e47fa725ed not found: ID does not exist" containerID="d94f7f2359b6db901cb864e5edbf470b794d920d271de232f7de85e47fa725ed" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.149910 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d94f7f2359b6db901cb864e5edbf470b794d920d271de232f7de85e47fa725ed"} err="failed to get container status \"d94f7f2359b6db901cb864e5edbf470b794d920d271de232f7de85e47fa725ed\": rpc error: code = NotFound desc = could not find container \"d94f7f2359b6db901cb864e5edbf470b794d920d271de232f7de85e47fa725ed\": container with ID starting with d94f7f2359b6db901cb864e5edbf470b794d920d271de232f7de85e47fa725ed not found: ID does not exist" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.212407 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlcnf\" (UniqueName: \"kubernetes.io/projected/ba64e013-8b1b-4082-9ca9-50b24a53362f-kube-api-access-qlcnf\") pod \"glance90ee-account-delete-4ft46\" (UID: \"ba64e013-8b1b-4082-9ca9-50b24a53362f\") " pod="glance-kuttl-tests/glance90ee-account-delete-4ft46" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.212522 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba64e013-8b1b-4082-9ca9-50b24a53362f-operator-scripts\") pod \"glance90ee-account-delete-4ft46\" (UID: \"ba64e013-8b1b-4082-9ca9-50b24a53362f\") " pod="glance-kuttl-tests/glance90ee-account-delete-4ft46" Nov 25 16:03:49 crc kubenswrapper[4704]: E1125 16:03:49.227127 4704 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28aa6246_69d0_4ba2_b8d7_90972781c27d.slice\": RecentStats: unable to find data in memory cache]" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.314066 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba64e013-8b1b-4082-9ca9-50b24a53362f-operator-scripts\") pod \"glance90ee-account-delete-4ft46\" (UID: \"ba64e013-8b1b-4082-9ca9-50b24a53362f\") " pod="glance-kuttl-tests/glance90ee-account-delete-4ft46" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.314152 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlcnf\" (UniqueName: \"kubernetes.io/projected/ba64e013-8b1b-4082-9ca9-50b24a53362f-kube-api-access-qlcnf\") pod \"glance90ee-account-delete-4ft46\" (UID: \"ba64e013-8b1b-4082-9ca9-50b24a53362f\") " pod="glance-kuttl-tests/glance90ee-account-delete-4ft46" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.314982 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba64e013-8b1b-4082-9ca9-50b24a53362f-operator-scripts\") pod \"glance90ee-account-delete-4ft46\" (UID: \"ba64e013-8b1b-4082-9ca9-50b24a53362f\") " pod="glance-kuttl-tests/glance90ee-account-delete-4ft46" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.333002 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlcnf\" (UniqueName: \"kubernetes.io/projected/ba64e013-8b1b-4082-9ca9-50b24a53362f-kube-api-access-qlcnf\") pod \"glance90ee-account-delete-4ft46\" (UID: \"ba64e013-8b1b-4082-9ca9-50b24a53362f\") " pod="glance-kuttl-tests/glance90ee-account-delete-4ft46" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.416654 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:03:49 crc 
kubenswrapper[4704]: E1125 16:03:49.416920 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.419128 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance90ee-account-delete-4ft46" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.424055 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.424320 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerName="glance-log" containerID="cri-o://980a3e14be2bfb287117cc886bb023ebabe2cf0486b5beb817984b29a900b78e" gracePeriod=30 Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.424381 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerName="glance-api" containerID="cri-o://23d64f60f78a194e9f0b84c9ff9aabfe6cdaf860a0c77e90cfbd93a1690077e8" gracePeriod=30 Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.424441 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerName="glance-httpd" containerID="cri-o://afd5aae978df19dba491f861e9b57fa98b42f5fb2cec66340ac2741bc94a4b7b" gracePeriod=30 Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 
16:03:49.447134 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz"] Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.458202 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-external-api-0-cleaner-2940144t6dhz"] Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.510546 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.510857 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="d4e3b666-6607-432c-9274-a75ba8716911" containerName="glance-log" containerID="cri-o://f87011cb44ef6ac465628839c9297b4a26dfe8381366f2b0d1bcbac539cc8299" gracePeriod=30 Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.511242 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="d4e3b666-6607-432c-9274-a75ba8716911" containerName="glance-httpd" containerID="cri-o://82cf845e3515098866a1241ef9bceeea60278a66c29144abeda36839f973d258" gracePeriod=30 Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.511266 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="d4e3b666-6607-432c-9274-a75ba8716911" containerName="glance-api" containerID="cri-o://338f08b9726471fdea676b2cb113909d8cd04f6e543de7b22ccb05d0a0ee1f1f" gracePeriod=30 Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.532841 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg"] Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.545967 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["glance-kuttl-tests/glance-cache-glance-default-internal-api-0-cleaner-2940144bmwlg"] Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.733923 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance90ee-account-delete-4ft46"] Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.795270 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.116:9292/healthcheck\": read tcp 10.217.0.2:50146->10.217.0.116:9292: read: connection reset by peer" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.795270 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerName="glance-api" probeResult="failure" output="Get \"http://10.217.0.116:9292/healthcheck\": read tcp 10.217.0.2:50154->10.217.0.116:9292: read: connection reset by peer" Nov 25 16:03:49 crc kubenswrapper[4704]: I1125 16:03:49.795248 4704 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.116:9292/healthcheck\": read tcp 10.217.0.2:50130->10.217.0.116:9292: read: connection reset by peer" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.102672 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance90ee-account-delete-4ft46" event={"ID":"ba64e013-8b1b-4082-9ca9-50b24a53362f","Type":"ContainerStarted","Data":"2e68718120309392a3cabb40ebe12fb003696fcbc94457a6eb9c60dc50323dc1"} Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.103112 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance90ee-account-delete-4ft46" 
event={"ID":"ba64e013-8b1b-4082-9ca9-50b24a53362f","Type":"ContainerStarted","Data":"e3b85324da8593b3ba53f92c9fc721bb113b71911d2606c430f2c73af9be5925"} Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.108300 4704 generic.go:334] "Generic (PLEG): container finished" podID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerID="afd5aae978df19dba491f861e9b57fa98b42f5fb2cec66340ac2741bc94a4b7b" exitCode=0 Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.108334 4704 generic.go:334] "Generic (PLEG): container finished" podID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerID="980a3e14be2bfb287117cc886bb023ebabe2cf0486b5beb817984b29a900b78e" exitCode=143 Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.108387 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"04d98209-2337-49da-a0ac-1f12810f5fb3","Type":"ContainerDied","Data":"afd5aae978df19dba491f861e9b57fa98b42f5fb2cec66340ac2741bc94a4b7b"} Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.108426 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"04d98209-2337-49da-a0ac-1f12810f5fb3","Type":"ContainerDied","Data":"980a3e14be2bfb287117cc886bb023ebabe2cf0486b5beb817984b29a900b78e"} Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.112189 4704 generic.go:334] "Generic (PLEG): container finished" podID="d4e3b666-6607-432c-9274-a75ba8716911" containerID="82cf845e3515098866a1241ef9bceeea60278a66c29144abeda36839f973d258" exitCode=0 Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.112213 4704 generic.go:334] "Generic (PLEG): container finished" podID="d4e3b666-6607-432c-9274-a75ba8716911" containerID="f87011cb44ef6ac465628839c9297b4a26dfe8381366f2b0d1bcbac539cc8299" exitCode=143 Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.112236 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"d4e3b666-6607-432c-9274-a75ba8716911","Type":"ContainerDied","Data":"82cf845e3515098866a1241ef9bceeea60278a66c29144abeda36839f973d258"} Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.112257 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"d4e3b666-6607-432c-9274-a75ba8716911","Type":"ContainerDied","Data":"f87011cb44ef6ac465628839c9297b4a26dfe8381366f2b0d1bcbac539cc8299"} Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.387393 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.426336 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28aa6246-69d0-4ba2-b8d7-90972781c27d" path="/var/lib/kubelet/pods/28aa6246-69d0-4ba2-b8d7-90972781c27d/volumes" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.427757 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f5c1d5-2ab3-46ef-89f4-2447584718e2" path="/var/lib/kubelet/pods/45f5c1d5-2ab3-46ef-89f4-2447584718e2/volumes" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.428687 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d79d8223-8a36-4af2-84ac-ce2d4c024eb4" path="/var/lib/kubelet/pods/d79d8223-8a36-4af2-84ac-ce2d4c024eb4/volumes" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.430119 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcae4c02-2b13-40ea-b7d3-0dba85d8e793" path="/var/lib/kubelet/pods/dcae4c02-2b13-40ea-b7d3-0dba85d8e793/volumes" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.466491 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541248 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4e3b666-6607-432c-9274-a75ba8716911-httpd-run\") pod \"d4e3b666-6607-432c-9274-a75ba8716911\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541341 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-sys\") pod \"d4e3b666-6607-432c-9274-a75ba8716911\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541365 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-etc-nvme\") pod \"d4e3b666-6607-432c-9274-a75ba8716911\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541381 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-dev\") pod \"d4e3b666-6607-432c-9274-a75ba8716911\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541423 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-lib-modules\") pod \"d4e3b666-6607-432c-9274-a75ba8716911\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541458 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-run\") pod \"d4e3b666-6607-432c-9274-a75ba8716911\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541478 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-etc-iscsi\") pod \"d4e3b666-6607-432c-9274-a75ba8716911\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541502 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dccfr\" (UniqueName: \"kubernetes.io/projected/d4e3b666-6607-432c-9274-a75ba8716911-kube-api-access-dccfr\") pod \"d4e3b666-6607-432c-9274-a75ba8716911\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541521 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4e3b666-6607-432c-9274-a75ba8716911-scripts\") pod \"d4e3b666-6607-432c-9274-a75ba8716911\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541537 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-var-locks-brick\") pod \"d4e3b666-6607-432c-9274-a75ba8716911\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541573 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "d4e3b666-6607-432c-9274-a75ba8716911" (UID: "d4e3b666-6607-432c-9274-a75ba8716911"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541590 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"d4e3b666-6607-432c-9274-a75ba8716911\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541631 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-sys" (OuterVolumeSpecName: "sys") pod "d4e3b666-6607-432c-9274-a75ba8716911" (UID: "d4e3b666-6607-432c-9274-a75ba8716911"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541676 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "d4e3b666-6607-432c-9274-a75ba8716911" (UID: "d4e3b666-6607-432c-9274-a75ba8716911"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541691 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4e3b666-6607-432c-9274-a75ba8716911-logs\") pod \"d4e3b666-6607-432c-9274-a75ba8716911\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541811 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"d4e3b666-6607-432c-9274-a75ba8716911\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541846 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e3b666-6607-432c-9274-a75ba8716911-config-data\") pod \"d4e3b666-6607-432c-9274-a75ba8716911\" (UID: \"d4e3b666-6607-432c-9274-a75ba8716911\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.542363 4704 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.542385 4704 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-sys\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.542396 4704 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541636 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d4e3b666-6607-432c-9274-a75ba8716911-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d4e3b666-6607-432c-9274-a75ba8716911" (UID: "d4e3b666-6607-432c-9274-a75ba8716911"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541718 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-dev" (OuterVolumeSpecName: "dev") pod "d4e3b666-6607-432c-9274-a75ba8716911" (UID: "d4e3b666-6607-432c-9274-a75ba8716911"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541732 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "d4e3b666-6607-432c-9274-a75ba8716911" (UID: "d4e3b666-6607-432c-9274-a75ba8716911"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541745 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-run" (OuterVolumeSpecName: "run") pod "d4e3b666-6607-432c-9274-a75ba8716911" (UID: "d4e3b666-6607-432c-9274-a75ba8716911"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.541764 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "d4e3b666-6607-432c-9274-a75ba8716911" (UID: "d4e3b666-6607-432c-9274-a75ba8716911"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.542012 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4e3b666-6607-432c-9274-a75ba8716911-logs" (OuterVolumeSpecName: "logs") pod "d4e3b666-6607-432c-9274-a75ba8716911" (UID: "d4e3b666-6607-432c-9274-a75ba8716911"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.548204 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "d4e3b666-6607-432c-9274-a75ba8716911" (UID: "d4e3b666-6607-432c-9274-a75ba8716911"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.548376 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance-cache") pod "d4e3b666-6607-432c-9274-a75ba8716911" (UID: "d4e3b666-6607-432c-9274-a75ba8716911"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.548540 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4e3b666-6607-432c-9274-a75ba8716911-kube-api-access-dccfr" (OuterVolumeSpecName: "kube-api-access-dccfr") pod "d4e3b666-6607-432c-9274-a75ba8716911" (UID: "d4e3b666-6607-432c-9274-a75ba8716911"). InnerVolumeSpecName "kube-api-access-dccfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.549475 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e3b666-6607-432c-9274-a75ba8716911-scripts" (OuterVolumeSpecName: "scripts") pod "d4e3b666-6607-432c-9274-a75ba8716911" (UID: "d4e3b666-6607-432c-9274-a75ba8716911"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.615651 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e3b666-6607-432c-9274-a75ba8716911-config-data" (OuterVolumeSpecName: "config-data") pod "d4e3b666-6607-432c-9274-a75ba8716911" (UID: "d4e3b666-6607-432c-9274-a75ba8716911"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.643744 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04d98209-2337-49da-a0ac-1f12810f5fb3-logs\") pod \"04d98209-2337-49da-a0ac-1f12810f5fb3\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.643810 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-etc-iscsi\") pod \"04d98209-2337-49da-a0ac-1f12810f5fb3\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.643931 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04d98209-2337-49da-a0ac-1f12810f5fb3-httpd-run\") pod \"04d98209-2337-49da-a0ac-1f12810f5fb3\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.643977 4704 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-var-locks-brick\") pod \"04d98209-2337-49da-a0ac-1f12810f5fb3\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644022 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-lib-modules\") pod \"04d98209-2337-49da-a0ac-1f12810f5fb3\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644067 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fshqz\" (UniqueName: \"kubernetes.io/projected/04d98209-2337-49da-a0ac-1f12810f5fb3-kube-api-access-fshqz\") pod \"04d98209-2337-49da-a0ac-1f12810f5fb3\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644083 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04d98209-2337-49da-a0ac-1f12810f5fb3-scripts\") pod \"04d98209-2337-49da-a0ac-1f12810f5fb3\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644118 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"04d98209-2337-49da-a0ac-1f12810f5fb3\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644150 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04d98209-2337-49da-a0ac-1f12810f5fb3-config-data\") pod \"04d98209-2337-49da-a0ac-1f12810f5fb3\" (UID: 
\"04d98209-2337-49da-a0ac-1f12810f5fb3\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644179 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"04d98209-2337-49da-a0ac-1f12810f5fb3\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644194 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-run\") pod \"04d98209-2337-49da-a0ac-1f12810f5fb3\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644212 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-dev\") pod \"04d98209-2337-49da-a0ac-1f12810f5fb3\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644226 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-sys\") pod \"04d98209-2337-49da-a0ac-1f12810f5fb3\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644242 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-etc-nvme\") pod \"04d98209-2337-49da-a0ac-1f12810f5fb3\" (UID: \"04d98209-2337-49da-a0ac-1f12810f5fb3\") " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644502 4704 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc 
kubenswrapper[4704]: I1125 16:03:50.644494 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04d98209-2337-49da-a0ac-1f12810f5fb3-logs" (OuterVolumeSpecName: "logs") pod "04d98209-2337-49da-a0ac-1f12810f5fb3" (UID: "04d98209-2337-49da-a0ac-1f12810f5fb3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644514 4704 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-run\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644555 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "04d98209-2337-49da-a0ac-1f12810f5fb3" (UID: "04d98209-2337-49da-a0ac-1f12810f5fb3"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644574 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dccfr\" (UniqueName: \"kubernetes.io/projected/d4e3b666-6607-432c-9274-a75ba8716911-kube-api-access-dccfr\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644583 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "04d98209-2337-49da-a0ac-1f12810f5fb3" (UID: "04d98209-2337-49da-a0ac-1f12810f5fb3"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644587 4704 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4e3b666-6607-432c-9274-a75ba8716911-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644604 4704 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644624 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644638 4704 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4e3b666-6607-432c-9274-a75ba8716911-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644660 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644670 4704 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e3b666-6607-432c-9274-a75ba8716911-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644679 4704 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4e3b666-6607-432c-9274-a75ba8716911-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.644687 4704 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/d4e3b666-6607-432c-9274-a75ba8716911-dev\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.645191 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-run" (OuterVolumeSpecName: "run") pod "04d98209-2337-49da-a0ac-1f12810f5fb3" (UID: "04d98209-2337-49da-a0ac-1f12810f5fb3"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.645341 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-sys" (OuterVolumeSpecName: "sys") pod "04d98209-2337-49da-a0ac-1f12810f5fb3" (UID: "04d98209-2337-49da-a0ac-1f12810f5fb3"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.645372 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-dev" (OuterVolumeSpecName: "dev") pod "04d98209-2337-49da-a0ac-1f12810f5fb3" (UID: "04d98209-2337-49da-a0ac-1f12810f5fb3"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.645390 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "04d98209-2337-49da-a0ac-1f12810f5fb3" (UID: "04d98209-2337-49da-a0ac-1f12810f5fb3"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.645538 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04d98209-2337-49da-a0ac-1f12810f5fb3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "04d98209-2337-49da-a0ac-1f12810f5fb3" (UID: "04d98209-2337-49da-a0ac-1f12810f5fb3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.646086 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "04d98209-2337-49da-a0ac-1f12810f5fb3" (UID: "04d98209-2337-49da-a0ac-1f12810f5fb3"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.649062 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "04d98209-2337-49da-a0ac-1f12810f5fb3" (UID: "04d98209-2337-49da-a0ac-1f12810f5fb3"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.649775 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04d98209-2337-49da-a0ac-1f12810f5fb3-scripts" (OuterVolumeSpecName: "scripts") pod "04d98209-2337-49da-a0ac-1f12810f5fb3" (UID: "04d98209-2337-49da-a0ac-1f12810f5fb3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.655089 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04d98209-2337-49da-a0ac-1f12810f5fb3-kube-api-access-fshqz" (OuterVolumeSpecName: "kube-api-access-fshqz") pod "04d98209-2337-49da-a0ac-1f12810f5fb3" (UID: "04d98209-2337-49da-a0ac-1f12810f5fb3"). InnerVolumeSpecName "kube-api-access-fshqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.659449 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "04d98209-2337-49da-a0ac-1f12810f5fb3" (UID: "04d98209-2337-49da-a0ac-1f12810f5fb3"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.660887 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.665642 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.708523 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04d98209-2337-49da-a0ac-1f12810f5fb3-config-data" (OuterVolumeSpecName: "config-data") pod "04d98209-2337-49da-a0ac-1f12810f5fb3" (UID: "04d98209-2337-49da-a0ac-1f12810f5fb3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.745846 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.747046 4704 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-run\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.747132 4704 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-dev\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.747212 4704 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-sys\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.747278 4704 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-etc-nvme\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.747340 4704 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04d98209-2337-49da-a0ac-1f12810f5fb3-logs\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.747401 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.747471 4704 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-etc-iscsi\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.747533 4704 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04d98209-2337-49da-a0ac-1f12810f5fb3-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.747591 4704 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-var-locks-brick\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.747650 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.747723 4704 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04d98209-2337-49da-a0ac-1f12810f5fb3-lib-modules\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.747800 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fshqz\" (UniqueName: \"kubernetes.io/projected/04d98209-2337-49da-a0ac-1f12810f5fb3-kube-api-access-fshqz\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.747883 4704 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04d98209-2337-49da-a0ac-1f12810f5fb3-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.747984 4704 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.748080 4704 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04d98209-2337-49da-a0ac-1f12810f5fb3-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.758650 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.768909 4704 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.849561 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:50 crc kubenswrapper[4704]: I1125 16:03:50.849610 4704 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.122151 4704 generic.go:334] "Generic (PLEG): container finished" podID="ba64e013-8b1b-4082-9ca9-50b24a53362f" containerID="2e68718120309392a3cabb40ebe12fb003696fcbc94457a6eb9c60dc50323dc1" exitCode=0 Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.122279 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance90ee-account-delete-4ft46" event={"ID":"ba64e013-8b1b-4082-9ca9-50b24a53362f","Type":"ContainerDied","Data":"2e68718120309392a3cabb40ebe12fb003696fcbc94457a6eb9c60dc50323dc1"} Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.125058 4704 generic.go:334] "Generic (PLEG): container finished" podID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerID="23d64f60f78a194e9f0b84c9ff9aabfe6cdaf860a0c77e90cfbd93a1690077e8" exitCode=0 Nov 25 16:03:51 crc 
kubenswrapper[4704]: I1125 16:03:51.125126 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.125168 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"04d98209-2337-49da-a0ac-1f12810f5fb3","Type":"ContainerDied","Data":"23d64f60f78a194e9f0b84c9ff9aabfe6cdaf860a0c77e90cfbd93a1690077e8"} Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.125206 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"04d98209-2337-49da-a0ac-1f12810f5fb3","Type":"ContainerDied","Data":"24b0f1a6b39f617b8a84cee86f3a8cd8277b492d6a08e87116a959e392a25d6a"} Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.125228 4704 scope.go:117] "RemoveContainer" containerID="23d64f60f78a194e9f0b84c9ff9aabfe6cdaf860a0c77e90cfbd93a1690077e8" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.128893 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.128935 4704 generic.go:334] "Generic (PLEG): container finished" podID="d4e3b666-6607-432c-9274-a75ba8716911" containerID="338f08b9726471fdea676b2cb113909d8cd04f6e543de7b22ccb05d0a0ee1f1f" exitCode=0 Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.128976 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"d4e3b666-6607-432c-9274-a75ba8716911","Type":"ContainerDied","Data":"338f08b9726471fdea676b2cb113909d8cd04f6e543de7b22ccb05d0a0ee1f1f"} Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.129021 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"d4e3b666-6607-432c-9274-a75ba8716911","Type":"ContainerDied","Data":"27c0de1a36d2a8d290d00de42aa3fc8959ffa876c9c4181c69898b0da75c1d52"} Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.165210 4704 scope.go:117] "RemoveContainer" containerID="afd5aae978df19dba491f861e9b57fa98b42f5fb2cec66340ac2741bc94a4b7b" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.181330 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.189309 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.198628 4704 scope.go:117] "RemoveContainer" containerID="980a3e14be2bfb287117cc886bb023ebabe2cf0486b5beb817984b29a900b78e" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.199519 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.204679 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["glance-kuttl-tests/glance-default-internal-api-0"] Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.214327 4704 scope.go:117] "RemoveContainer" containerID="23d64f60f78a194e9f0b84c9ff9aabfe6cdaf860a0c77e90cfbd93a1690077e8" Nov 25 16:03:51 crc kubenswrapper[4704]: E1125 16:03:51.214718 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23d64f60f78a194e9f0b84c9ff9aabfe6cdaf860a0c77e90cfbd93a1690077e8\": container with ID starting with 23d64f60f78a194e9f0b84c9ff9aabfe6cdaf860a0c77e90cfbd93a1690077e8 not found: ID does not exist" containerID="23d64f60f78a194e9f0b84c9ff9aabfe6cdaf860a0c77e90cfbd93a1690077e8" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.214749 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23d64f60f78a194e9f0b84c9ff9aabfe6cdaf860a0c77e90cfbd93a1690077e8"} err="failed to get container status \"23d64f60f78a194e9f0b84c9ff9aabfe6cdaf860a0c77e90cfbd93a1690077e8\": rpc error: code = NotFound desc = could not find container \"23d64f60f78a194e9f0b84c9ff9aabfe6cdaf860a0c77e90cfbd93a1690077e8\": container with ID starting with 23d64f60f78a194e9f0b84c9ff9aabfe6cdaf860a0c77e90cfbd93a1690077e8 not found: ID does not exist" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.214769 4704 scope.go:117] "RemoveContainer" containerID="afd5aae978df19dba491f861e9b57fa98b42f5fb2cec66340ac2741bc94a4b7b" Nov 25 16:03:51 crc kubenswrapper[4704]: E1125 16:03:51.215159 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afd5aae978df19dba491f861e9b57fa98b42f5fb2cec66340ac2741bc94a4b7b\": container with ID starting with afd5aae978df19dba491f861e9b57fa98b42f5fb2cec66340ac2741bc94a4b7b not found: ID does not exist" containerID="afd5aae978df19dba491f861e9b57fa98b42f5fb2cec66340ac2741bc94a4b7b" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 
16:03:51.215202 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd5aae978df19dba491f861e9b57fa98b42f5fb2cec66340ac2741bc94a4b7b"} err="failed to get container status \"afd5aae978df19dba491f861e9b57fa98b42f5fb2cec66340ac2741bc94a4b7b\": rpc error: code = NotFound desc = could not find container \"afd5aae978df19dba491f861e9b57fa98b42f5fb2cec66340ac2741bc94a4b7b\": container with ID starting with afd5aae978df19dba491f861e9b57fa98b42f5fb2cec66340ac2741bc94a4b7b not found: ID does not exist" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.215252 4704 scope.go:117] "RemoveContainer" containerID="980a3e14be2bfb287117cc886bb023ebabe2cf0486b5beb817984b29a900b78e" Nov 25 16:03:51 crc kubenswrapper[4704]: E1125 16:03:51.215585 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"980a3e14be2bfb287117cc886bb023ebabe2cf0486b5beb817984b29a900b78e\": container with ID starting with 980a3e14be2bfb287117cc886bb023ebabe2cf0486b5beb817984b29a900b78e not found: ID does not exist" containerID="980a3e14be2bfb287117cc886bb023ebabe2cf0486b5beb817984b29a900b78e" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.215619 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"980a3e14be2bfb287117cc886bb023ebabe2cf0486b5beb817984b29a900b78e"} err="failed to get container status \"980a3e14be2bfb287117cc886bb023ebabe2cf0486b5beb817984b29a900b78e\": rpc error: code = NotFound desc = could not find container \"980a3e14be2bfb287117cc886bb023ebabe2cf0486b5beb817984b29a900b78e\": container with ID starting with 980a3e14be2bfb287117cc886bb023ebabe2cf0486b5beb817984b29a900b78e not found: ID does not exist" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.215662 4704 scope.go:117] "RemoveContainer" containerID="338f08b9726471fdea676b2cb113909d8cd04f6e543de7b22ccb05d0a0ee1f1f" Nov 25 16:03:51 crc 
kubenswrapper[4704]: I1125 16:03:51.244151 4704 scope.go:117] "RemoveContainer" containerID="82cf845e3515098866a1241ef9bceeea60278a66c29144abeda36839f973d258" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.266000 4704 scope.go:117] "RemoveContainer" containerID="f87011cb44ef6ac465628839c9297b4a26dfe8381366f2b0d1bcbac539cc8299" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.284051 4704 scope.go:117] "RemoveContainer" containerID="338f08b9726471fdea676b2cb113909d8cd04f6e543de7b22ccb05d0a0ee1f1f" Nov 25 16:03:51 crc kubenswrapper[4704]: E1125 16:03:51.284517 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"338f08b9726471fdea676b2cb113909d8cd04f6e543de7b22ccb05d0a0ee1f1f\": container with ID starting with 338f08b9726471fdea676b2cb113909d8cd04f6e543de7b22ccb05d0a0ee1f1f not found: ID does not exist" containerID="338f08b9726471fdea676b2cb113909d8cd04f6e543de7b22ccb05d0a0ee1f1f" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.284593 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"338f08b9726471fdea676b2cb113909d8cd04f6e543de7b22ccb05d0a0ee1f1f"} err="failed to get container status \"338f08b9726471fdea676b2cb113909d8cd04f6e543de7b22ccb05d0a0ee1f1f\": rpc error: code = NotFound desc = could not find container \"338f08b9726471fdea676b2cb113909d8cd04f6e543de7b22ccb05d0a0ee1f1f\": container with ID starting with 338f08b9726471fdea676b2cb113909d8cd04f6e543de7b22ccb05d0a0ee1f1f not found: ID does not exist" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.284659 4704 scope.go:117] "RemoveContainer" containerID="82cf845e3515098866a1241ef9bceeea60278a66c29144abeda36839f973d258" Nov 25 16:03:51 crc kubenswrapper[4704]: E1125 16:03:51.285090 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"82cf845e3515098866a1241ef9bceeea60278a66c29144abeda36839f973d258\": container with ID starting with 82cf845e3515098866a1241ef9bceeea60278a66c29144abeda36839f973d258 not found: ID does not exist" containerID="82cf845e3515098866a1241ef9bceeea60278a66c29144abeda36839f973d258" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.285122 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82cf845e3515098866a1241ef9bceeea60278a66c29144abeda36839f973d258"} err="failed to get container status \"82cf845e3515098866a1241ef9bceeea60278a66c29144abeda36839f973d258\": rpc error: code = NotFound desc = could not find container \"82cf845e3515098866a1241ef9bceeea60278a66c29144abeda36839f973d258\": container with ID starting with 82cf845e3515098866a1241ef9bceeea60278a66c29144abeda36839f973d258 not found: ID does not exist" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.285140 4704 scope.go:117] "RemoveContainer" containerID="f87011cb44ef6ac465628839c9297b4a26dfe8381366f2b0d1bcbac539cc8299" Nov 25 16:03:51 crc kubenswrapper[4704]: E1125 16:03:51.285436 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f87011cb44ef6ac465628839c9297b4a26dfe8381366f2b0d1bcbac539cc8299\": container with ID starting with f87011cb44ef6ac465628839c9297b4a26dfe8381366f2b0d1bcbac539cc8299 not found: ID does not exist" containerID="f87011cb44ef6ac465628839c9297b4a26dfe8381366f2b0d1bcbac539cc8299" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.285464 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f87011cb44ef6ac465628839c9297b4a26dfe8381366f2b0d1bcbac539cc8299"} err="failed to get container status \"f87011cb44ef6ac465628839c9297b4a26dfe8381366f2b0d1bcbac539cc8299\": rpc error: code = NotFound desc = could not find container \"f87011cb44ef6ac465628839c9297b4a26dfe8381366f2b0d1bcbac539cc8299\": container with ID 
starting with f87011cb44ef6ac465628839c9297b4a26dfe8381366f2b0d1bcbac539cc8299 not found: ID does not exist" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.405992 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance90ee-account-delete-4ft46" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.557550 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlcnf\" (UniqueName: \"kubernetes.io/projected/ba64e013-8b1b-4082-9ca9-50b24a53362f-kube-api-access-qlcnf\") pod \"ba64e013-8b1b-4082-9ca9-50b24a53362f\" (UID: \"ba64e013-8b1b-4082-9ca9-50b24a53362f\") " Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.557743 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba64e013-8b1b-4082-9ca9-50b24a53362f-operator-scripts\") pod \"ba64e013-8b1b-4082-9ca9-50b24a53362f\" (UID: \"ba64e013-8b1b-4082-9ca9-50b24a53362f\") " Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.558937 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba64e013-8b1b-4082-9ca9-50b24a53362f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba64e013-8b1b-4082-9ca9-50b24a53362f" (UID: "ba64e013-8b1b-4082-9ca9-50b24a53362f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.563239 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba64e013-8b1b-4082-9ca9-50b24a53362f-kube-api-access-qlcnf" (OuterVolumeSpecName: "kube-api-access-qlcnf") pod "ba64e013-8b1b-4082-9ca9-50b24a53362f" (UID: "ba64e013-8b1b-4082-9ca9-50b24a53362f"). InnerVolumeSpecName "kube-api-access-qlcnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.659012 4704 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba64e013-8b1b-4082-9ca9-50b24a53362f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:51 crc kubenswrapper[4704]: I1125 16:03:51.659048 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlcnf\" (UniqueName: \"kubernetes.io/projected/ba64e013-8b1b-4082-9ca9-50b24a53362f-kube-api-access-qlcnf\") on node \"crc\" DevicePath \"\"" Nov 25 16:03:52 crc kubenswrapper[4704]: I1125 16:03:52.137861 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance90ee-account-delete-4ft46" event={"ID":"ba64e013-8b1b-4082-9ca9-50b24a53362f","Type":"ContainerDied","Data":"e3b85324da8593b3ba53f92c9fc721bb113b71911d2606c430f2c73af9be5925"} Nov 25 16:03:52 crc kubenswrapper[4704]: I1125 16:03:52.137906 4704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3b85324da8593b3ba53f92c9fc721bb113b71911d2606c430f2c73af9be5925" Nov 25 16:03:52 crc kubenswrapper[4704]: I1125 16:03:52.137887 4704 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance90ee-account-delete-4ft46" Nov 25 16:03:52 crc kubenswrapper[4704]: I1125 16:03:52.428616 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04d98209-2337-49da-a0ac-1f12810f5fb3" path="/var/lib/kubelet/pods/04d98209-2337-49da-a0ac-1f12810f5fb3/volumes" Nov 25 16:03:52 crc kubenswrapper[4704]: I1125 16:03:52.429352 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4e3b666-6607-432c-9274-a75ba8716911" path="/var/lib/kubelet/pods/d4e3b666-6607-432c-9274-a75ba8716911/volumes" Nov 25 16:03:54 crc kubenswrapper[4704]: I1125 16:03:54.116765 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-jbcwh"] Nov 25 16:03:54 crc kubenswrapper[4704]: I1125 16:03:54.123639 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-jbcwh"] Nov 25 16:03:54 crc kubenswrapper[4704]: I1125 16:03:54.130496 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance90ee-account-delete-4ft46"] Nov 25 16:03:54 crc kubenswrapper[4704]: I1125 16:03:54.136942 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance90ee-account-delete-4ft46"] Nov 25 16:03:54 crc kubenswrapper[4704]: I1125 16:03:54.143032 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-90ee-account-create-update-mcgs5"] Nov 25 16:03:54 crc kubenswrapper[4704]: I1125 16:03:54.148473 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-90ee-account-create-update-mcgs5"] Nov 25 16:03:54 crc kubenswrapper[4704]: I1125 16:03:54.425353 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae716d7-1612-4ba4-8860-733eca08736b" path="/var/lib/kubelet/pods/3ae716d7-1612-4ba4-8860-733eca08736b/volumes" Nov 25 16:03:54 crc kubenswrapper[4704]: I1125 16:03:54.426403 4704 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9" path="/var/lib/kubelet/pods/9b7f0a1d-2ee4-4fbe-9851-f5dde313efd9/volumes" Nov 25 16:03:54 crc kubenswrapper[4704]: I1125 16:03:54.426901 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba64e013-8b1b-4082-9ca9-50b24a53362f" path="/var/lib/kubelet/pods/ba64e013-8b1b-4082-9ca9-50b24a53362f/volumes" Nov 25 16:04:03 crc kubenswrapper[4704]: I1125 16:04:03.416232 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:04:03 crc kubenswrapper[4704]: E1125 16:04:03.417256 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:04:17 crc kubenswrapper[4704]: I1125 16:04:17.417202 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:04:17 crc kubenswrapper[4704]: E1125 16:04:17.418058 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:04:17 crc kubenswrapper[4704]: I1125 16:04:17.529092 4704 scope.go:117] "RemoveContainer" containerID="2c503d00362f7bb6ae694255b8afdfe642c071dcdb7d74aef84b1b9f77ede00c" Nov 25 16:04:17 crc kubenswrapper[4704]: I1125 16:04:17.553119 4704 scope.go:117] 
"RemoveContainer" containerID="bde0e01a25eecbb150247fe41eb9cd801171db31f44920282ed771dea6a25965" Nov 25 16:04:17 crc kubenswrapper[4704]: I1125 16:04:17.579210 4704 scope.go:117] "RemoveContainer" containerID="53830a382248bd999649d537d07e2b1fdeacd6405b8aa625425ed45926b58384" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.862484 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d78tg/must-gather-v5dx9"] Nov 25 16:04:18 crc kubenswrapper[4704]: E1125 16:04:18.862805 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e3b666-6607-432c-9274-a75ba8716911" containerName="glance-api" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.862819 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e3b666-6607-432c-9274-a75ba8716911" containerName="glance-api" Nov 25 16:04:18 crc kubenswrapper[4704]: E1125 16:04:18.862832 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerName="glance-httpd" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.862839 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerName="glance-httpd" Nov 25 16:04:18 crc kubenswrapper[4704]: E1125 16:04:18.862856 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerName="glance-log" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.862863 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerName="glance-log" Nov 25 16:04:18 crc kubenswrapper[4704]: E1125 16:04:18.862874 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba64e013-8b1b-4082-9ca9-50b24a53362f" containerName="mariadb-account-delete" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.862880 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba64e013-8b1b-4082-9ca9-50b24a53362f" 
containerName="mariadb-account-delete" Nov 25 16:04:18 crc kubenswrapper[4704]: E1125 16:04:18.862893 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerName="glance-api" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.862898 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerName="glance-api" Nov 25 16:04:18 crc kubenswrapper[4704]: E1125 16:04:18.862903 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e3b666-6607-432c-9274-a75ba8716911" containerName="glance-log" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.862909 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e3b666-6607-432c-9274-a75ba8716911" containerName="glance-log" Nov 25 16:04:18 crc kubenswrapper[4704]: E1125 16:04:18.862922 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e3b666-6607-432c-9274-a75ba8716911" containerName="glance-httpd" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.862927 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e3b666-6607-432c-9274-a75ba8716911" containerName="glance-httpd" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.863040 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba64e013-8b1b-4082-9ca9-50b24a53362f" containerName="mariadb-account-delete" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.863058 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerName="glance-httpd" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.863065 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerName="glance-log" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.863070 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4e3b666-6607-432c-9274-a75ba8716911" 
containerName="glance-api" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.863082 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d98209-2337-49da-a0ac-1f12810f5fb3" containerName="glance-api" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.863091 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4e3b666-6607-432c-9274-a75ba8716911" containerName="glance-httpd" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.863101 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4e3b666-6607-432c-9274-a75ba8716911" containerName="glance-log" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.863898 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d78tg/must-gather-v5dx9" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.872277 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d78tg"/"openshift-service-ca.crt" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.872492 4704 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-d78tg"/"default-dockercfg-nm4h7" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.878193 4704 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d78tg"/"kube-root-ca.crt" Nov 25 16:04:18 crc kubenswrapper[4704]: I1125 16:04:18.904428 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d78tg/must-gather-v5dx9"] Nov 25 16:04:19 crc kubenswrapper[4704]: I1125 16:04:19.046183 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjkj9\" (UniqueName: \"kubernetes.io/projected/64da260d-a929-4455-b1ad-76b6d3d0fd38-kube-api-access-gjkj9\") pod \"must-gather-v5dx9\" (UID: \"64da260d-a929-4455-b1ad-76b6d3d0fd38\") " pod="openshift-must-gather-d78tg/must-gather-v5dx9" Nov 25 16:04:19 crc 
kubenswrapper[4704]: I1125 16:04:19.046647 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/64da260d-a929-4455-b1ad-76b6d3d0fd38-must-gather-output\") pod \"must-gather-v5dx9\" (UID: \"64da260d-a929-4455-b1ad-76b6d3d0fd38\") " pod="openshift-must-gather-d78tg/must-gather-v5dx9" Nov 25 16:04:19 crc kubenswrapper[4704]: I1125 16:04:19.147904 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjkj9\" (UniqueName: \"kubernetes.io/projected/64da260d-a929-4455-b1ad-76b6d3d0fd38-kube-api-access-gjkj9\") pod \"must-gather-v5dx9\" (UID: \"64da260d-a929-4455-b1ad-76b6d3d0fd38\") " pod="openshift-must-gather-d78tg/must-gather-v5dx9" Nov 25 16:04:19 crc kubenswrapper[4704]: I1125 16:04:19.148011 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/64da260d-a929-4455-b1ad-76b6d3d0fd38-must-gather-output\") pod \"must-gather-v5dx9\" (UID: \"64da260d-a929-4455-b1ad-76b6d3d0fd38\") " pod="openshift-must-gather-d78tg/must-gather-v5dx9" Nov 25 16:04:19 crc kubenswrapper[4704]: I1125 16:04:19.148815 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/64da260d-a929-4455-b1ad-76b6d3d0fd38-must-gather-output\") pod \"must-gather-v5dx9\" (UID: \"64da260d-a929-4455-b1ad-76b6d3d0fd38\") " pod="openshift-must-gather-d78tg/must-gather-v5dx9" Nov 25 16:04:19 crc kubenswrapper[4704]: I1125 16:04:19.168265 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjkj9\" (UniqueName: \"kubernetes.io/projected/64da260d-a929-4455-b1ad-76b6d3d0fd38-kube-api-access-gjkj9\") pod \"must-gather-v5dx9\" (UID: \"64da260d-a929-4455-b1ad-76b6d3d0fd38\") " pod="openshift-must-gather-d78tg/must-gather-v5dx9" Nov 25 16:04:19 crc 
kubenswrapper[4704]: I1125 16:04:19.189404 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d78tg/must-gather-v5dx9" Nov 25 16:04:19 crc kubenswrapper[4704]: I1125 16:04:19.391141 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d78tg/must-gather-v5dx9"] Nov 25 16:04:19 crc kubenswrapper[4704]: I1125 16:04:19.399120 4704 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 16:04:20 crc kubenswrapper[4704]: I1125 16:04:20.348316 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d78tg/must-gather-v5dx9" event={"ID":"64da260d-a929-4455-b1ad-76b6d3d0fd38","Type":"ContainerStarted","Data":"3965877de7a02a54780b9d60ce4b0e06dc6cc9b6b9533944b3586ea14910612e"} Nov 25 16:04:23 crc kubenswrapper[4704]: I1125 16:04:23.373625 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d78tg/must-gather-v5dx9" event={"ID":"64da260d-a929-4455-b1ad-76b6d3d0fd38","Type":"ContainerStarted","Data":"f1afbef17ebf9f71bcf99c8d9c11bbbf4a0d8abd9454d590113ef5538a28f1db"} Nov 25 16:04:24 crc kubenswrapper[4704]: I1125 16:04:24.383508 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d78tg/must-gather-v5dx9" event={"ID":"64da260d-a929-4455-b1ad-76b6d3d0fd38","Type":"ContainerStarted","Data":"e8eee619e08966294a86f60a8ba81cfb374a29c9134edb82c64be286b8b2038a"} Nov 25 16:04:24 crc kubenswrapper[4704]: I1125 16:04:24.399764 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d78tg/must-gather-v5dx9" podStartSLOduration=2.80116285 podStartE2EDuration="6.399748037s" podCreationTimestamp="2025-11-25 16:04:18 +0000 UTC" firstStartedPulling="2025-11-25 16:04:19.398859146 +0000 UTC m=+1745.667132927" lastFinishedPulling="2025-11-25 16:04:22.997444333 +0000 UTC m=+1749.265718114" observedRunningTime="2025-11-25 16:04:24.398156971 +0000 UTC 
m=+1750.666430752" watchObservedRunningTime="2025-11-25 16:04:24.399748037 +0000 UTC m=+1750.668021818" Nov 25 16:04:33 crc kubenswrapper[4704]: I1125 16:04:33.416650 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:04:33 crc kubenswrapper[4704]: E1125 16:04:33.417679 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:04:44 crc kubenswrapper[4704]: I1125 16:04:44.420522 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:04:44 crc kubenswrapper[4704]: E1125 16:04:44.421601 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:04:57 crc kubenswrapper[4704]: I1125 16:04:57.195552 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt_7418c389-acf6-4fe8-b7be-b149451e186a/util/0.log" Nov 25 16:04:57 crc kubenswrapper[4704]: I1125 16:04:57.365010 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt_7418c389-acf6-4fe8-b7be-b149451e186a/util/0.log" Nov 25 16:04:57 crc 
kubenswrapper[4704]: I1125 16:04:57.420096 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt_7418c389-acf6-4fe8-b7be-b149451e186a/pull/0.log" Nov 25 16:04:57 crc kubenswrapper[4704]: I1125 16:04:57.435440 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt_7418c389-acf6-4fe8-b7be-b149451e186a/pull/0.log" Nov 25 16:04:57 crc kubenswrapper[4704]: I1125 16:04:57.591660 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt_7418c389-acf6-4fe8-b7be-b149451e186a/pull/0.log" Nov 25 16:04:57 crc kubenswrapper[4704]: I1125 16:04:57.594469 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt_7418c389-acf6-4fe8-b7be-b149451e186a/util/0.log" Nov 25 16:04:57 crc kubenswrapper[4704]: I1125 16:04:57.621110 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2f4edfcf5dbe2bb1a8f2f28075d211bc098902ebe0cc05af48345acc3d8nzlt_7418c389-acf6-4fe8-b7be-b149451e186a/extract/0.log" Nov 25 16:04:57 crc kubenswrapper[4704]: I1125 16:04:57.773922 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s_25f90ba3-d788-402c-9d77-5653b1b3bed6/util/0.log" Nov 25 16:04:57 crc kubenswrapper[4704]: I1125 16:04:57.943516 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s_25f90ba3-d788-402c-9d77-5653b1b3bed6/util/0.log" Nov 25 16:04:57 crc kubenswrapper[4704]: I1125 16:04:57.962155 4704 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s_25f90ba3-d788-402c-9d77-5653b1b3bed6/pull/0.log" Nov 25 16:04:57 crc kubenswrapper[4704]: I1125 16:04:57.968390 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s_25f90ba3-d788-402c-9d77-5653b1b3bed6/pull/0.log" Nov 25 16:04:58 crc kubenswrapper[4704]: I1125 16:04:58.115827 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s_25f90ba3-d788-402c-9d77-5653b1b3bed6/util/0.log" Nov 25 16:04:58 crc kubenswrapper[4704]: I1125 16:04:58.171191 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s_25f90ba3-d788-402c-9d77-5653b1b3bed6/extract/0.log" Nov 25 16:04:58 crc kubenswrapper[4704]: I1125 16:04:58.272761 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5d473c3169f40b179d14921c90af2c8546b7b757fe551b7dba7d903f5dp754s_25f90ba3-d788-402c-9d77-5653b1b3bed6/pull/0.log" Nov 25 16:04:58 crc kubenswrapper[4704]: I1125 16:04:58.306558 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k_68b8e4d2-c04c-470e-a4c8-debcf659c143/util/0.log" Nov 25 16:04:58 crc kubenswrapper[4704]: I1125 16:04:58.482483 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k_68b8e4d2-c04c-470e-a4c8-debcf659c143/pull/0.log" Nov 25 16:04:58 crc kubenswrapper[4704]: I1125 16:04:58.507279 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k_68b8e4d2-c04c-470e-a4c8-debcf659c143/pull/0.log" Nov 25 16:04:58 crc 
kubenswrapper[4704]: I1125 16:04:58.533337 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k_68b8e4d2-c04c-470e-a4c8-debcf659c143/util/0.log" Nov 25 16:04:58 crc kubenswrapper[4704]: I1125 16:04:58.681632 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k_68b8e4d2-c04c-470e-a4c8-debcf659c143/extract/0.log" Nov 25 16:04:58 crc kubenswrapper[4704]: I1125 16:04:58.687366 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k_68b8e4d2-c04c-470e-a4c8-debcf659c143/pull/0.log" Nov 25 16:04:58 crc kubenswrapper[4704]: I1125 16:04:58.707549 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_87b4bb7621dcb67338b53778f2871f07aa0e4d3dfcd0fd25724bfd240b7z89k_68b8e4d2-c04c-470e-a4c8-debcf659c143/util/0.log" Nov 25 16:04:58 crc kubenswrapper[4704]: I1125 16:04:58.878627 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs_c6f65fa5-cbd1-45a8-8b39-0255370b20c4/util/0.log" Nov 25 16:04:59 crc kubenswrapper[4704]: I1125 16:04:59.057200 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs_c6f65fa5-cbd1-45a8-8b39-0255370b20c4/pull/0.log" Nov 25 16:04:59 crc kubenswrapper[4704]: I1125 16:04:59.072642 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs_c6f65fa5-cbd1-45a8-8b39-0255370b20c4/util/0.log" Nov 25 16:04:59 crc kubenswrapper[4704]: I1125 16:04:59.076069 4704 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs_c6f65fa5-cbd1-45a8-8b39-0255370b20c4/pull/0.log" Nov 25 16:04:59 crc kubenswrapper[4704]: I1125 16:04:59.379085 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs_c6f65fa5-cbd1-45a8-8b39-0255370b20c4/pull/0.log" Nov 25 16:04:59 crc kubenswrapper[4704]: I1125 16:04:59.395207 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs_c6f65fa5-cbd1-45a8-8b39-0255370b20c4/extract/0.log" Nov 25 16:04:59 crc kubenswrapper[4704]: I1125 16:04:59.416354 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:04:59 crc kubenswrapper[4704]: E1125 16:04:59.416815 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:04:59 crc kubenswrapper[4704]: I1125 16:04:59.430215 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59092hzs_c6f65fa5-cbd1-45a8-8b39-0255370b20c4/util/0.log" Nov 25 16:04:59 crc kubenswrapper[4704]: I1125 16:04:59.581754 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2_58a8a704-aa96-4788-825e-a343803ac76b/util/0.log" Nov 25 16:04:59 crc kubenswrapper[4704]: I1125 16:04:59.768367 4704 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2_58a8a704-aa96-4788-825e-a343803ac76b/util/0.log" Nov 25 16:04:59 crc kubenswrapper[4704]: I1125 16:04:59.790035 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2_58a8a704-aa96-4788-825e-a343803ac76b/pull/0.log" Nov 25 16:04:59 crc kubenswrapper[4704]: I1125 16:04:59.823581 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2_58a8a704-aa96-4788-825e-a343803ac76b/pull/0.log" Nov 25 16:04:59 crc kubenswrapper[4704]: I1125 16:04:59.991089 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2_58a8a704-aa96-4788-825e-a343803ac76b/pull/0.log" Nov 25 16:05:00 crc kubenswrapper[4704]: I1125 16:05:00.015576 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2_58a8a704-aa96-4788-825e-a343803ac76b/extract/0.log" Nov 25 16:05:00 crc kubenswrapper[4704]: I1125 16:05:00.036717 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9f0c59a3968beec894e04476dd5efd0a707bad85f482efd4940498368crcrf2_58a8a704-aa96-4788-825e-a343803ac76b/util/0.log" Nov 25 16:05:00 crc kubenswrapper[4704]: I1125 16:05:00.169669 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56_3649f81a-9887-4dba-91e7-66192abf74df/util/0.log" Nov 25 16:05:00 crc kubenswrapper[4704]: I1125 16:05:00.377736 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56_3649f81a-9887-4dba-91e7-66192abf74df/util/0.log" Nov 25 16:05:00 crc 
kubenswrapper[4704]: I1125 16:05:00.387396 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56_3649f81a-9887-4dba-91e7-66192abf74df/pull/0.log" Nov 25 16:05:00 crc kubenswrapper[4704]: I1125 16:05:00.413585 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56_3649f81a-9887-4dba-91e7-66192abf74df/pull/0.log" Nov 25 16:05:00 crc kubenswrapper[4704]: I1125 16:05:00.539348 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56_3649f81a-9887-4dba-91e7-66192abf74df/util/0.log" Nov 25 16:05:00 crc kubenswrapper[4704]: I1125 16:05:00.568546 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56_3649f81a-9887-4dba-91e7-66192abf74df/extract/0.log" Nov 25 16:05:00 crc kubenswrapper[4704]: I1125 16:05:00.576997 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c976308faac62824ee875fa80dce4db57a79e32adb8a627dd31cdf72f6cgx56_3649f81a-9887-4dba-91e7-66192abf74df/pull/0.log" Nov 25 16:05:00 crc kubenswrapper[4704]: I1125 16:05:00.623112 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt_1f032655-f673-4d19-a90f-67d2e4cbc198/util/0.log" Nov 25 16:05:00 crc kubenswrapper[4704]: I1125 16:05:00.804854 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt_1f032655-f673-4d19-a90f-67d2e4cbc198/pull/0.log" Nov 25 16:05:00 crc kubenswrapper[4704]: I1125 16:05:00.806468 4704 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt_1f032655-f673-4d19-a90f-67d2e4cbc198/util/0.log" Nov 25 16:05:00 crc kubenswrapper[4704]: I1125 16:05:00.827107 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt_1f032655-f673-4d19-a90f-67d2e4cbc198/pull/0.log" Nov 25 16:05:01 crc kubenswrapper[4704]: I1125 16:05:01.027912 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt_1f032655-f673-4d19-a90f-67d2e4cbc198/extract/0.log" Nov 25 16:05:01 crc kubenswrapper[4704]: I1125 16:05:01.032952 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt_1f032655-f673-4d19-a90f-67d2e4cbc198/pull/0.log" Nov 25 16:05:01 crc kubenswrapper[4704]: I1125 16:05:01.081849 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d854280893f664a16f85f7c4268f877fa95509a4e25ae77fea242eaaa3959pt_1f032655-f673-4d19-a90f-67d2e4cbc198/util/0.log" Nov 25 16:05:01 crc kubenswrapper[4704]: I1125 16:05:01.136985 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-9fd6d6f67-vlkl7_3649b0e8-675b-4b8f-9d4a-ff24b9edf553/manager/0.log" Nov 25 16:05:01 crc kubenswrapper[4704]: I1125 16:05:01.232454 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-index-ph68v_d2af75d4-d76a-48ea-baa4-0ce23b299e48/registry-server/0.log" Nov 25 16:05:01 crc kubenswrapper[4704]: I1125 16:05:01.312943 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d4c77ff9c-lgm4t_13eebb33-6991-4747-883d-c13102f4a922/manager/0.log" Nov 25 16:05:01 crc kubenswrapper[4704]: I1125 16:05:01.439379 
4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-index-24d6s_8200c73c-66eb-457b-8cdc-c3773b532d29/registry-server/0.log" Nov 25 16:05:01 crc kubenswrapper[4704]: I1125 16:05:01.523988 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-67cd7d6948-7k7tj_8f214e62-ec20-41c1-835c-1daab12028a0/kube-rbac-proxy/0.log" Nov 25 16:05:01 crc kubenswrapper[4704]: I1125 16:05:01.531132 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-67cd7d6948-7k7tj_8f214e62-ec20-41c1-835c-1daab12028a0/manager/0.log" Nov 25 16:05:01 crc kubenswrapper[4704]: I1125 16:05:01.664876 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-index-sdvpn_5908a0b9-0b90-4ea4-a3f3-2a67f15fd3b5/registry-server/0.log" Nov 25 16:05:01 crc kubenswrapper[4704]: I1125 16:05:01.762269 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5684f64755-gm29f_440b2c47-17eb-4f7f-893e-7ccc849d2557/manager/0.log" Nov 25 16:05:01 crc kubenswrapper[4704]: I1125 16:05:01.837915 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-index-mdz8p_17f6718f-4687-4f52-827a-479e1af368ed/registry-server/0.log" Nov 25 16:05:01 crc kubenswrapper[4704]: I1125 16:05:01.872936 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c5bb565f-pcghm_2bd8f053-a003-406f-abd4-fabbee649785/manager/0.log" Nov 25 16:05:02 crc kubenswrapper[4704]: I1125 16:05:02.026209 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-index-cmwzk_9997000b-ed10-4ef0-9456-02578320964d/registry-server/0.log" Nov 25 16:05:02 crc kubenswrapper[4704]: I1125 16:05:02.080468 4704 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-779fc9694b-848dk_e8a33191-6af5-44c1-8f3f-74c8e186a7e3/operator/0.log" Nov 25 16:05:02 crc kubenswrapper[4704]: I1125 16:05:02.155402 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-index-99mng_14f98924-1b85-429a-81dd-2d1b3f836464/registry-server/0.log" Nov 25 16:05:02 crc kubenswrapper[4704]: I1125 16:05:02.252374 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6d895d4c49-gsd7z_3af84a01-fe4f-44f8-88a2-e68ae8855933/manager/0.log" Nov 25 16:05:02 crc kubenswrapper[4704]: I1125 16:05:02.336021 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-index-9fzqh_3988efd6-25d0-48aa-8750-aad3d5d9c525/registry-server/0.log" Nov 25 16:05:12 crc kubenswrapper[4704]: I1125 16:05:12.417021 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:05:12 crc kubenswrapper[4704]: E1125 16:05:12.418072 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:05:17 crc kubenswrapper[4704]: I1125 16:05:17.171984 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-kklq9_b8100f5c-f741-4ef6-b462-d2ce26957517/control-plane-machine-set-operator/0.log" Nov 25 16:05:17 crc kubenswrapper[4704]: I1125 16:05:17.356293 4704 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fz52t_7b416e1d-da7d-4da7-9bae-210c815d4cf1/kube-rbac-proxy/0.log" Nov 25 16:05:17 crc kubenswrapper[4704]: I1125 16:05:17.384076 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fz52t_7b416e1d-da7d-4da7-9bae-210c815d4cf1/machine-api-operator/0.log" Nov 25 16:05:24 crc kubenswrapper[4704]: I1125 16:05:24.419596 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:05:24 crc kubenswrapper[4704]: E1125 16:05:24.419998 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:05:34 crc kubenswrapper[4704]: I1125 16:05:34.064549 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-2bp8m_3e21f57a-b980-4bb4-8367-b2c3216b5e17/kube-rbac-proxy/0.log" Nov 25 16:05:34 crc kubenswrapper[4704]: I1125 16:05:34.160490 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-2bp8m_3e21f57a-b980-4bb4-8367-b2c3216b5e17/controller/0.log" Nov 25 16:05:34 crc kubenswrapper[4704]: I1125 16:05:34.266415 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/cp-frr-files/0.log" Nov 25 16:05:34 crc kubenswrapper[4704]: I1125 16:05:34.537655 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/cp-reloader/0.log" Nov 25 16:05:34 crc kubenswrapper[4704]: I1125 
16:05:34.545511 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/cp-frr-files/0.log" Nov 25 16:05:34 crc kubenswrapper[4704]: I1125 16:05:34.612181 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/cp-reloader/0.log" Nov 25 16:05:34 crc kubenswrapper[4704]: I1125 16:05:34.653345 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/cp-metrics/0.log" Nov 25 16:05:34 crc kubenswrapper[4704]: I1125 16:05:34.813955 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/cp-reloader/0.log" Nov 25 16:05:34 crc kubenswrapper[4704]: I1125 16:05:34.825335 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/cp-metrics/0.log" Nov 25 16:05:34 crc kubenswrapper[4704]: I1125 16:05:34.885759 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/cp-frr-files/0.log" Nov 25 16:05:34 crc kubenswrapper[4704]: I1125 16:05:34.911458 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/cp-metrics/0.log" Nov 25 16:05:35 crc kubenswrapper[4704]: I1125 16:05:35.086236 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/cp-reloader/0.log" Nov 25 16:05:35 crc kubenswrapper[4704]: I1125 16:05:35.113064 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/controller/0.log" Nov 25 16:05:35 crc kubenswrapper[4704]: I1125 16:05:35.127302 4704 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/cp-frr-files/0.log" Nov 25 16:05:35 crc kubenswrapper[4704]: I1125 16:05:35.135086 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/cp-metrics/0.log" Nov 25 16:05:35 crc kubenswrapper[4704]: I1125 16:05:35.319570 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/frr-metrics/0.log" Nov 25 16:05:35 crc kubenswrapper[4704]: I1125 16:05:35.379898 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/kube-rbac-proxy-frr/0.log" Nov 25 16:05:35 crc kubenswrapper[4704]: I1125 16:05:35.389244 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/kube-rbac-proxy/0.log" Nov 25 16:05:35 crc kubenswrapper[4704]: I1125 16:05:35.538206 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/reloader/0.log" Nov 25 16:05:35 crc kubenswrapper[4704]: I1125 16:05:35.652498 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-l8hl8_ebf01769-76d3-4f64-bd68-f20add5a1266/frr-k8s-webhook-server/0.log" Nov 25 16:05:35 crc kubenswrapper[4704]: I1125 16:05:35.825482 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-dllmv_29b053f9-2ac9-4eeb-bb2b-adbe17dfab59/frr/0.log" Nov 25 16:05:35 crc kubenswrapper[4704]: I1125 16:05:35.922005 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5f4957f9b7-lcxrs_d7e15179-0dfe-4339-a233-4ebea59bc0f6/manager/0.log" Nov 25 16:05:36 crc kubenswrapper[4704]: I1125 16:05:36.093731 4704 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-65c6dc9bcf-5m5qc_ca5ed8d1-6524-4d56-a49a-afb3cf8a5320/webhook-server/0.log" Nov 25 16:05:36 crc kubenswrapper[4704]: I1125 16:05:36.149996 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lbpq9_0f413579-b97a-443b-8487-b1424b1e5a4e/kube-rbac-proxy/0.log" Nov 25 16:05:36 crc kubenswrapper[4704]: I1125 16:05:36.264855 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lbpq9_0f413579-b97a-443b-8487-b1424b1e5a4e/speaker/0.log" Nov 25 16:05:39 crc kubenswrapper[4704]: I1125 16:05:39.417326 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:05:39 crc kubenswrapper[4704]: E1125 16:05:39.417967 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:05:49 crc kubenswrapper[4704]: I1125 16:05:49.367866 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_keystone-cron-29401441-64x6t_021a02df-e360-4c89-8f31-a891a0b3286e/keystone-cron/0.log" Nov 25 16:05:49 crc kubenswrapper[4704]: I1125 16:05:49.785018 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_4067b873-5563-4077-b020-6464d703ddd4/mysql-bootstrap/0.log" Nov 25 16:05:49 crc kubenswrapper[4704]: I1125 16:05:49.806275 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_keystone-fc478b69-fc2jj_45254a49-d34b-464a-97ec-0b04cbd7c1fe/keystone-api/0.log" Nov 25 16:05:50 crc kubenswrapper[4704]: I1125 16:05:50.035459 4704 
log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_4067b873-5563-4077-b020-6464d703ddd4/mysql-bootstrap/0.log" Nov 25 16:05:50 crc kubenswrapper[4704]: I1125 16:05:50.057755 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_4067b873-5563-4077-b020-6464d703ddd4/galera/0.log" Nov 25 16:05:50 crc kubenswrapper[4704]: I1125 16:05:50.246902 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_18b12599-9af6-4deb-8943-c8048a44a236/mysql-bootstrap/0.log" Nov 25 16:05:50 crc kubenswrapper[4704]: I1125 16:05:50.518053 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_18b12599-9af6-4deb-8943-c8048a44a236/mysql-bootstrap/0.log" Nov 25 16:05:50 crc kubenswrapper[4704]: I1125 16:05:50.596269 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_18b12599-9af6-4deb-8943-c8048a44a236/galera/0.log" Nov 25 16:05:50 crc kubenswrapper[4704]: I1125 16:05:50.751353 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_65796556-fea5-482e-a4e8-883f027c30ba/mysql-bootstrap/0.log" Nov 25 16:05:50 crc kubenswrapper[4704]: I1125 16:05:50.971841 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_65796556-fea5-482e-a4e8-883f027c30ba/mysql-bootstrap/0.log" Nov 25 16:05:51 crc kubenswrapper[4704]: I1125 16:05:51.045831 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_65796556-fea5-482e-a4e8-883f027c30ba/galera/0.log" Nov 25 16:05:51 crc kubenswrapper[4704]: I1125 16:05:51.151863 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstackclient_22e6497e-244f-43ec-be2f-bb10b67a619e/openstackclient/0.log" Nov 25 16:05:51 crc kubenswrapper[4704]: I1125 16:05:51.290196 4704 
log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a/setup-container/0.log" Nov 25 16:05:51 crc kubenswrapper[4704]: I1125 16:05:51.292021 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_memcached-0_6366e7fa-3e60-4828-9571-a04c313af8df/memcached/0.log" Nov 25 16:05:51 crc kubenswrapper[4704]: I1125 16:05:51.519278 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a/setup-container/0.log" Nov 25 16:05:51 crc kubenswrapper[4704]: I1125 16:05:51.549769 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-proxy-6bd58cfcf7-ctjch_06d09490-29de-42d3-a33f-067f3c9ba573/proxy-httpd/0.log" Nov 25 16:05:51 crc kubenswrapper[4704]: I1125 16:05:51.585676 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_bbf3d0e9-b6e9-4d83-ad0c-f0843dbb9b1a/rabbitmq/0.log" Nov 25 16:05:51 crc kubenswrapper[4704]: I1125 16:05:51.723570 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-proxy-6bd58cfcf7-ctjch_06d09490-29de-42d3-a33f-067f3c9ba573/proxy-server/0.log" Nov 25 16:05:51 crc kubenswrapper[4704]: I1125 16:05:51.838695 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-ring-rebalance-m7p8n_f25398c6-78b6-4b82-b3bf-ac1037e47998/swift-ring-rebalance/0.log" Nov 25 16:05:51 crc kubenswrapper[4704]: I1125 16:05:51.960390 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ace2c1c5-31ae-43db-891a-6a587176c215/account-reaper/0.log" Nov 25 16:05:51 crc kubenswrapper[4704]: I1125 16:05:51.993503 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ace2c1c5-31ae-43db-891a-6a587176c215/account-auditor/0.log" Nov 25 16:05:52 crc kubenswrapper[4704]: I1125 
16:05:52.060134 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ace2c1c5-31ae-43db-891a-6a587176c215/account-replicator/0.log" Nov 25 16:05:52 crc kubenswrapper[4704]: I1125 16:05:52.062298 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ace2c1c5-31ae-43db-891a-6a587176c215/account-server/0.log" Nov 25 16:05:52 crc kubenswrapper[4704]: I1125 16:05:52.164530 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ace2c1c5-31ae-43db-891a-6a587176c215/container-replicator/0.log" Nov 25 16:05:52 crc kubenswrapper[4704]: I1125 16:05:52.191191 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ace2c1c5-31ae-43db-891a-6a587176c215/container-auditor/0.log" Nov 25 16:05:52 crc kubenswrapper[4704]: I1125 16:05:52.274940 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ace2c1c5-31ae-43db-891a-6a587176c215/container-updater/0.log" Nov 25 16:05:52 crc kubenswrapper[4704]: I1125 16:05:52.283434 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ace2c1c5-31ae-43db-891a-6a587176c215/container-server/0.log" Nov 25 16:05:52 crc kubenswrapper[4704]: I1125 16:05:52.385564 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ace2c1c5-31ae-43db-891a-6a587176c215/object-auditor/0.log" Nov 25 16:05:52 crc kubenswrapper[4704]: I1125 16:05:52.407368 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ace2c1c5-31ae-43db-891a-6a587176c215/object-expirer/0.log" Nov 25 16:05:52 crc kubenswrapper[4704]: I1125 16:05:52.417295 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:05:52 crc kubenswrapper[4704]: E1125 16:05:52.417593 4704 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:05:52 crc kubenswrapper[4704]: I1125 16:05:52.635337 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ace2c1c5-31ae-43db-891a-6a587176c215/object-replicator/0.log" Nov 25 16:05:52 crc kubenswrapper[4704]: I1125 16:05:52.654225 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ace2c1c5-31ae-43db-891a-6a587176c215/object-server/0.log" Nov 25 16:05:52 crc kubenswrapper[4704]: I1125 16:05:52.742951 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ace2c1c5-31ae-43db-891a-6a587176c215/object-updater/0.log" Nov 25 16:05:52 crc kubenswrapper[4704]: I1125 16:05:52.746688 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ace2c1c5-31ae-43db-891a-6a587176c215/rsync/0.log" Nov 25 16:05:52 crc kubenswrapper[4704]: I1125 16:05:52.850231 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_ace2c1c5-31ae-43db-891a-6a587176c215/swift-recon-cron/0.log" Nov 25 16:06:05 crc kubenswrapper[4704]: I1125 16:06:05.109017 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qm54c_4912dfbf-fbd9-41d7-aba3-0a02558ab662/extract-utilities/0.log" Nov 25 16:06:05 crc kubenswrapper[4704]: I1125 16:06:05.293564 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qm54c_4912dfbf-fbd9-41d7-aba3-0a02558ab662/extract-utilities/0.log" 
Nov 25 16:06:05 crc kubenswrapper[4704]: I1125 16:06:05.369631 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qm54c_4912dfbf-fbd9-41d7-aba3-0a02558ab662/extract-content/0.log" Nov 25 16:06:05 crc kubenswrapper[4704]: I1125 16:06:05.383759 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qm54c_4912dfbf-fbd9-41d7-aba3-0a02558ab662/extract-content/0.log" Nov 25 16:06:05 crc kubenswrapper[4704]: I1125 16:06:05.649281 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qm54c_4912dfbf-fbd9-41d7-aba3-0a02558ab662/extract-content/0.log" Nov 25 16:06:05 crc kubenswrapper[4704]: I1125 16:06:05.695824 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qm54c_4912dfbf-fbd9-41d7-aba3-0a02558ab662/extract-utilities/0.log" Nov 25 16:06:05 crc kubenswrapper[4704]: I1125 16:06:05.899506 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s7lr8_f11b0bae-8584-41bd-9970-af1f50073c21/extract-utilities/0.log" Nov 25 16:06:06 crc kubenswrapper[4704]: I1125 16:06:06.060614 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s7lr8_f11b0bae-8584-41bd-9970-af1f50073c21/extract-utilities/0.log" Nov 25 16:06:06 crc kubenswrapper[4704]: I1125 16:06:06.097280 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s7lr8_f11b0bae-8584-41bd-9970-af1f50073c21/extract-content/0.log" Nov 25 16:06:06 crc kubenswrapper[4704]: I1125 16:06:06.141487 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s7lr8_f11b0bae-8584-41bd-9970-af1f50073c21/extract-content/0.log" Nov 25 16:06:06 crc kubenswrapper[4704]: I1125 16:06:06.302724 4704 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-qm54c_4912dfbf-fbd9-41d7-aba3-0a02558ab662/registry-server/0.log" Nov 25 16:06:06 crc kubenswrapper[4704]: I1125 16:06:06.348592 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s7lr8_f11b0bae-8584-41bd-9970-af1f50073c21/extract-utilities/0.log" Nov 25 16:06:06 crc kubenswrapper[4704]: I1125 16:06:06.381386 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s7lr8_f11b0bae-8584-41bd-9970-af1f50073c21/extract-content/0.log" Nov 25 16:06:06 crc kubenswrapper[4704]: I1125 16:06:06.426106 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:06:06 crc kubenswrapper[4704]: E1125 16:06:06.426363 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:06:06 crc kubenswrapper[4704]: I1125 16:06:06.650425 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_fe45db4d-f0e6-4706-8d50-f9777e8aff80/util/0.log" Nov 25 16:06:06 crc kubenswrapper[4704]: I1125 16:06:06.756756 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_fe45db4d-f0e6-4706-8d50-f9777e8aff80/util/0.log" Nov 25 16:06:06 crc kubenswrapper[4704]: I1125 16:06:06.890442 4704 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_fe45db4d-f0e6-4706-8d50-f9777e8aff80/pull/0.log" Nov 25 16:06:06 crc kubenswrapper[4704]: I1125 16:06:06.947317 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_fe45db4d-f0e6-4706-8d50-f9777e8aff80/pull/0.log" Nov 25 16:06:06 crc kubenswrapper[4704]: I1125 16:06:06.998608 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s7lr8_f11b0bae-8584-41bd-9970-af1f50073c21/registry-server/0.log" Nov 25 16:06:07 crc kubenswrapper[4704]: I1125 16:06:07.114496 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_fe45db4d-f0e6-4706-8d50-f9777e8aff80/pull/0.log" Nov 25 16:06:07 crc kubenswrapper[4704]: I1125 16:06:07.123542 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_fe45db4d-f0e6-4706-8d50-f9777e8aff80/util/0.log" Nov 25 16:06:07 crc kubenswrapper[4704]: I1125 16:06:07.132457 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c628kgh_fe45db4d-f0e6-4706-8d50-f9777e8aff80/extract/0.log" Nov 25 16:06:07 crc kubenswrapper[4704]: I1125 16:06:07.303561 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-crs88_54e9da8e-917b-4a46-9fe9-725f950fced1/marketplace-operator/0.log" Nov 25 16:06:07 crc kubenswrapper[4704]: I1125 16:06:07.368649 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zqxfm_c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13/extract-utilities/0.log" Nov 25 16:06:07 crc kubenswrapper[4704]: I1125 16:06:07.544362 4704 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zqxfm_c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13/extract-utilities/0.log" Nov 25 16:06:07 crc kubenswrapper[4704]: I1125 16:06:07.561850 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zqxfm_c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13/extract-content/0.log" Nov 25 16:06:07 crc kubenswrapper[4704]: I1125 16:06:07.566049 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zqxfm_c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13/extract-content/0.log" Nov 25 16:06:07 crc kubenswrapper[4704]: I1125 16:06:07.731241 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zqxfm_c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13/extract-content/0.log" Nov 25 16:06:07 crc kubenswrapper[4704]: I1125 16:06:07.781509 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zqxfm_c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13/extract-utilities/0.log" Nov 25 16:06:07 crc kubenswrapper[4704]: I1125 16:06:07.863436 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zqxfm_c0dcd3b2-1c21-4b62-8dc1-9c5ef2ee7a13/registry-server/0.log" Nov 25 16:06:07 crc kubenswrapper[4704]: I1125 16:06:07.937336 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-827sg_81c4aee6-0cc5-4f75-98b9-d546819ce1df/extract-utilities/0.log" Nov 25 16:06:08 crc kubenswrapper[4704]: I1125 16:06:08.108317 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-827sg_81c4aee6-0cc5-4f75-98b9-d546819ce1df/extract-utilities/0.log" Nov 25 16:06:08 crc kubenswrapper[4704]: I1125 16:06:08.108962 4704 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-827sg_81c4aee6-0cc5-4f75-98b9-d546819ce1df/extract-content/0.log" Nov 25 16:06:08 crc kubenswrapper[4704]: I1125 16:06:08.131403 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-827sg_81c4aee6-0cc5-4f75-98b9-d546819ce1df/extract-content/0.log" Nov 25 16:06:08 crc kubenswrapper[4704]: I1125 16:06:08.275445 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-827sg_81c4aee6-0cc5-4f75-98b9-d546819ce1df/extract-utilities/0.log" Nov 25 16:06:08 crc kubenswrapper[4704]: I1125 16:06:08.343605 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-827sg_81c4aee6-0cc5-4f75-98b9-d546819ce1df/extract-content/0.log" Nov 25 16:06:08 crc kubenswrapper[4704]: I1125 16:06:08.840928 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-827sg_81c4aee6-0cc5-4f75-98b9-d546819ce1df/registry-server/0.log" Nov 25 16:06:17 crc kubenswrapper[4704]: I1125 16:06:17.715568 4704 scope.go:117] "RemoveContainer" containerID="98389b4dee9c0d9a0c12f328d9bcc3ccebdc341cc1873f0b5cd310cc29b2ecd0" Nov 25 16:06:17 crc kubenswrapper[4704]: I1125 16:06:17.767103 4704 scope.go:117] "RemoveContainer" containerID="51b9520d80823ddde7c03722665803d37af915c16cbd1be2a90c02c02766515a" Nov 25 16:06:18 crc kubenswrapper[4704]: I1125 16:06:18.416961 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:06:18 crc kubenswrapper[4704]: E1125 16:06:18.417333 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:06:32 crc kubenswrapper[4704]: I1125 16:06:32.416920 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:06:32 crc kubenswrapper[4704]: E1125 16:06:32.418005 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:06:43 crc kubenswrapper[4704]: I1125 16:06:43.417448 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:06:43 crc kubenswrapper[4704]: E1125 16:06:43.418583 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:06:56 crc kubenswrapper[4704]: I1125 16:06:56.419372 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:06:56 crc kubenswrapper[4704]: E1125 16:06:56.420272 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:07:11 crc kubenswrapper[4704]: I1125 16:07:11.416171 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd" Nov 25 16:07:11 crc kubenswrapper[4704]: E1125 16:07:11.417138 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c" Nov 25 16:07:19 crc kubenswrapper[4704]: I1125 16:07:19.119256 4704 generic.go:334] "Generic (PLEG): container finished" podID="64da260d-a929-4455-b1ad-76b6d3d0fd38" containerID="f1afbef17ebf9f71bcf99c8d9c11bbbf4a0d8abd9454d590113ef5538a28f1db" exitCode=0 Nov 25 16:07:19 crc kubenswrapper[4704]: I1125 16:07:19.119391 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d78tg/must-gather-v5dx9" event={"ID":"64da260d-a929-4455-b1ad-76b6d3d0fd38","Type":"ContainerDied","Data":"f1afbef17ebf9f71bcf99c8d9c11bbbf4a0d8abd9454d590113ef5538a28f1db"} Nov 25 16:07:19 crc kubenswrapper[4704]: I1125 16:07:19.120282 4704 scope.go:117] "RemoveContainer" containerID="f1afbef17ebf9f71bcf99c8d9c11bbbf4a0d8abd9454d590113ef5538a28f1db" Nov 25 16:07:19 crc kubenswrapper[4704]: I1125 16:07:19.522640 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d78tg_must-gather-v5dx9_64da260d-a929-4455-b1ad-76b6d3d0fd38/gather/0.log" Nov 25 16:07:26 crc kubenswrapper[4704]: I1125 16:07:26.340395 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-d78tg/must-gather-v5dx9"]
Nov 25 16:07:26 crc kubenswrapper[4704]: I1125 16:07:26.343051 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-d78tg/must-gather-v5dx9" podUID="64da260d-a929-4455-b1ad-76b6d3d0fd38" containerName="copy" containerID="cri-o://e8eee619e08966294a86f60a8ba81cfb374a29c9134edb82c64be286b8b2038a" gracePeriod=2
Nov 25 16:07:26 crc kubenswrapper[4704]: I1125 16:07:26.347021 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d78tg/must-gather-v5dx9"]
Nov 25 16:07:26 crc kubenswrapper[4704]: I1125 16:07:26.416617 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd"
Nov 25 16:07:26 crc kubenswrapper[4704]: E1125 16:07:26.416947 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c"
Nov 25 16:07:26 crc kubenswrapper[4704]: I1125 16:07:26.755673 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d78tg_must-gather-v5dx9_64da260d-a929-4455-b1ad-76b6d3d0fd38/copy/0.log"
Nov 25 16:07:26 crc kubenswrapper[4704]: I1125 16:07:26.756528 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d78tg/must-gather-v5dx9"
Nov 25 16:07:26 crc kubenswrapper[4704]: I1125 16:07:26.926085 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/64da260d-a929-4455-b1ad-76b6d3d0fd38-must-gather-output\") pod \"64da260d-a929-4455-b1ad-76b6d3d0fd38\" (UID: \"64da260d-a929-4455-b1ad-76b6d3d0fd38\") "
Nov 25 16:07:26 crc kubenswrapper[4704]: I1125 16:07:26.926702 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjkj9\" (UniqueName: \"kubernetes.io/projected/64da260d-a929-4455-b1ad-76b6d3d0fd38-kube-api-access-gjkj9\") pod \"64da260d-a929-4455-b1ad-76b6d3d0fd38\" (UID: \"64da260d-a929-4455-b1ad-76b6d3d0fd38\") "
Nov 25 16:07:26 crc kubenswrapper[4704]: I1125 16:07:26.953183 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64da260d-a929-4455-b1ad-76b6d3d0fd38-kube-api-access-gjkj9" (OuterVolumeSpecName: "kube-api-access-gjkj9") pod "64da260d-a929-4455-b1ad-76b6d3d0fd38" (UID: "64da260d-a929-4455-b1ad-76b6d3d0fd38"). InnerVolumeSpecName "kube-api-access-gjkj9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:07:27 crc kubenswrapper[4704]: I1125 16:07:27.035042 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjkj9\" (UniqueName: \"kubernetes.io/projected/64da260d-a929-4455-b1ad-76b6d3d0fd38-kube-api-access-gjkj9\") on node \"crc\" DevicePath \"\""
Nov 25 16:07:27 crc kubenswrapper[4704]: I1125 16:07:27.037141 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64da260d-a929-4455-b1ad-76b6d3d0fd38-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "64da260d-a929-4455-b1ad-76b6d3d0fd38" (UID: "64da260d-a929-4455-b1ad-76b6d3d0fd38"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:07:27 crc kubenswrapper[4704]: I1125 16:07:27.136459 4704 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/64da260d-a929-4455-b1ad-76b6d3d0fd38-must-gather-output\") on node \"crc\" DevicePath \"\""
Nov 25 16:07:27 crc kubenswrapper[4704]: I1125 16:07:27.198669 4704 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d78tg_must-gather-v5dx9_64da260d-a929-4455-b1ad-76b6d3d0fd38/copy/0.log"
Nov 25 16:07:27 crc kubenswrapper[4704]: I1125 16:07:27.199354 4704 generic.go:334] "Generic (PLEG): container finished" podID="64da260d-a929-4455-b1ad-76b6d3d0fd38" containerID="e8eee619e08966294a86f60a8ba81cfb374a29c9134edb82c64be286b8b2038a" exitCode=143
Nov 25 16:07:27 crc kubenswrapper[4704]: I1125 16:07:27.199467 4704 scope.go:117] "RemoveContainer" containerID="e8eee619e08966294a86f60a8ba81cfb374a29c9134edb82c64be286b8b2038a"
Nov 25 16:07:27 crc kubenswrapper[4704]: I1125 16:07:27.199486 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d78tg/must-gather-v5dx9"
Nov 25 16:07:27 crc kubenswrapper[4704]: I1125 16:07:27.222474 4704 scope.go:117] "RemoveContainer" containerID="f1afbef17ebf9f71bcf99c8d9c11bbbf4a0d8abd9454d590113ef5538a28f1db"
Nov 25 16:07:27 crc kubenswrapper[4704]: I1125 16:07:27.268345 4704 scope.go:117] "RemoveContainer" containerID="e8eee619e08966294a86f60a8ba81cfb374a29c9134edb82c64be286b8b2038a"
Nov 25 16:07:27 crc kubenswrapper[4704]: E1125 16:07:27.269009 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8eee619e08966294a86f60a8ba81cfb374a29c9134edb82c64be286b8b2038a\": container with ID starting with e8eee619e08966294a86f60a8ba81cfb374a29c9134edb82c64be286b8b2038a not found: ID does not exist" containerID="e8eee619e08966294a86f60a8ba81cfb374a29c9134edb82c64be286b8b2038a"
Nov 25 16:07:27 crc kubenswrapper[4704]: I1125 16:07:27.269118 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8eee619e08966294a86f60a8ba81cfb374a29c9134edb82c64be286b8b2038a"} err="failed to get container status \"e8eee619e08966294a86f60a8ba81cfb374a29c9134edb82c64be286b8b2038a\": rpc error: code = NotFound desc = could not find container \"e8eee619e08966294a86f60a8ba81cfb374a29c9134edb82c64be286b8b2038a\": container with ID starting with e8eee619e08966294a86f60a8ba81cfb374a29c9134edb82c64be286b8b2038a not found: ID does not exist"
Nov 25 16:07:27 crc kubenswrapper[4704]: I1125 16:07:27.269203 4704 scope.go:117] "RemoveContainer" containerID="f1afbef17ebf9f71bcf99c8d9c11bbbf4a0d8abd9454d590113ef5538a28f1db"
Nov 25 16:07:27 crc kubenswrapper[4704]: E1125 16:07:27.269595 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1afbef17ebf9f71bcf99c8d9c11bbbf4a0d8abd9454d590113ef5538a28f1db\": container with ID starting with f1afbef17ebf9f71bcf99c8d9c11bbbf4a0d8abd9454d590113ef5538a28f1db not found: ID does not exist" containerID="f1afbef17ebf9f71bcf99c8d9c11bbbf4a0d8abd9454d590113ef5538a28f1db"
Nov 25 16:07:27 crc kubenswrapper[4704]: I1125 16:07:27.269643 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1afbef17ebf9f71bcf99c8d9c11bbbf4a0d8abd9454d590113ef5538a28f1db"} err="failed to get container status \"f1afbef17ebf9f71bcf99c8d9c11bbbf4a0d8abd9454d590113ef5538a28f1db\": rpc error: code = NotFound desc = could not find container \"f1afbef17ebf9f71bcf99c8d9c11bbbf4a0d8abd9454d590113ef5538a28f1db\": container with ID starting with f1afbef17ebf9f71bcf99c8d9c11bbbf4a0d8abd9454d590113ef5538a28f1db not found: ID does not exist"
Nov 25 16:07:28 crc kubenswrapper[4704]: I1125 16:07:28.424738 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64da260d-a929-4455-b1ad-76b6d3d0fd38" path="/var/lib/kubelet/pods/64da260d-a929-4455-b1ad-76b6d3d0fd38/volumes"
Nov 25 16:07:38 crc kubenswrapper[4704]: I1125 16:07:38.416865 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd"
Nov 25 16:07:38 crc kubenswrapper[4704]: E1125 16:07:38.417914 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c"
Nov 25 16:07:53 crc kubenswrapper[4704]: I1125 16:07:53.416150 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd"
Nov 25 16:07:53 crc kubenswrapper[4704]: E1125 16:07:53.416765 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c"
Nov 25 16:08:08 crc kubenswrapper[4704]: I1125 16:08:08.417667 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd"
Nov 25 16:08:08 crc kubenswrapper[4704]: E1125 16:08:08.419733 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c"
Nov 25 16:08:22 crc kubenswrapper[4704]: I1125 16:08:22.416257 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd"
Nov 25 16:08:22 crc kubenswrapper[4704]: E1125 16:08:22.417043 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c"
Nov 25 16:08:35 crc kubenswrapper[4704]: I1125 16:08:35.416944 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd"
Nov 25 16:08:35 crc kubenswrapper[4704]: E1125 16:08:35.418031 4704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-djz8x_openshift-machine-config-operator(91b52682-d008-4b8a-8bc3-26b032d7dc2c)\"" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" podUID="91b52682-d008-4b8a-8bc3-26b032d7dc2c"
Nov 25 16:08:50 crc kubenswrapper[4704]: I1125 16:08:50.416856 4704 scope.go:117] "RemoveContainer" containerID="2d40455a2b5295ff7b13e8f8c3068640929072e4f22451a92dc5d7701185f9fd"
Nov 25 16:08:50 crc kubenswrapper[4704]: I1125 16:08:50.782645 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-djz8x" event={"ID":"91b52682-d008-4b8a-8bc3-26b032d7dc2c","Type":"ContainerStarted","Data":"4c4c8fe9a394ae6b0fe625b7feadfdf60ae0074131d758b9041afd9beab81cd1"}
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.649402 4704 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wxkrt"]
Nov 25 16:09:17 crc kubenswrapper[4704]: E1125 16:09:17.650415 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64da260d-a929-4455-b1ad-76b6d3d0fd38" containerName="gather"
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.650434 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="64da260d-a929-4455-b1ad-76b6d3d0fd38" containerName="gather"
Nov 25 16:09:17 crc kubenswrapper[4704]: E1125 16:09:17.650464 4704 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64da260d-a929-4455-b1ad-76b6d3d0fd38" containerName="copy"
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.650472 4704 state_mem.go:107] "Deleted CPUSet assignment" podUID="64da260d-a929-4455-b1ad-76b6d3d0fd38" containerName="copy"
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.650659 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="64da260d-a929-4455-b1ad-76b6d3d0fd38" containerName="gather"
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.650681 4704 memory_manager.go:354] "RemoveStaleState removing state" podUID="64da260d-a929-4455-b1ad-76b6d3d0fd38" containerName="copy"
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.652229 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.656598 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wxkrt"]
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.789764 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx4fk\" (UniqueName: \"kubernetes.io/projected/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-kube-api-access-hx4fk\") pod \"community-operators-wxkrt\" (UID: \"ebed1169-f829-44b7-bb8b-8a4c9633e8a8\") " pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.789826 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-catalog-content\") pod \"community-operators-wxkrt\" (UID: \"ebed1169-f829-44b7-bb8b-8a4c9633e8a8\") " pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.789948 4704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-utilities\") pod \"community-operators-wxkrt\" (UID: \"ebed1169-f829-44b7-bb8b-8a4c9633e8a8\") " pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.890888 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx4fk\" (UniqueName: \"kubernetes.io/projected/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-kube-api-access-hx4fk\") pod \"community-operators-wxkrt\" (UID: \"ebed1169-f829-44b7-bb8b-8a4c9633e8a8\") " pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.890933 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-catalog-content\") pod \"community-operators-wxkrt\" (UID: \"ebed1169-f829-44b7-bb8b-8a4c9633e8a8\") " pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.890978 4704 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-utilities\") pod \"community-operators-wxkrt\" (UID: \"ebed1169-f829-44b7-bb8b-8a4c9633e8a8\") " pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.891524 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-catalog-content\") pod \"community-operators-wxkrt\" (UID: \"ebed1169-f829-44b7-bb8b-8a4c9633e8a8\") " pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.891573 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-utilities\") pod \"community-operators-wxkrt\" (UID: \"ebed1169-f829-44b7-bb8b-8a4c9633e8a8\") " pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.908888 4704 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx4fk\" (UniqueName: \"kubernetes.io/projected/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-kube-api-access-hx4fk\") pod \"community-operators-wxkrt\" (UID: \"ebed1169-f829-44b7-bb8b-8a4c9633e8a8\") " pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:17 crc kubenswrapper[4704]: I1125 16:09:17.972070 4704 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:18 crc kubenswrapper[4704]: I1125 16:09:18.254498 4704 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wxkrt"]
Nov 25 16:09:18 crc kubenswrapper[4704]: I1125 16:09:18.998585 4704 generic.go:334] "Generic (PLEG): container finished" podID="ebed1169-f829-44b7-bb8b-8a4c9633e8a8" containerID="876ed98704db044e96e731b452e4783e0aec6cff16bee47d334b8959579baeba" exitCode=0
Nov 25 16:09:18 crc kubenswrapper[4704]: I1125 16:09:18.998664 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxkrt" event={"ID":"ebed1169-f829-44b7-bb8b-8a4c9633e8a8","Type":"ContainerDied","Data":"876ed98704db044e96e731b452e4783e0aec6cff16bee47d334b8959579baeba"}
Nov 25 16:09:18 crc kubenswrapper[4704]: I1125 16:09:18.998942 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxkrt" event={"ID":"ebed1169-f829-44b7-bb8b-8a4c9633e8a8","Type":"ContainerStarted","Data":"cd5ea19ae6a3929fc597fb31f5962ef403329c83164a95781d6e5172a044d384"}
Nov 25 16:09:21 crc kubenswrapper[4704]: I1125 16:09:21.019223 4704 generic.go:334] "Generic (PLEG): container finished" podID="ebed1169-f829-44b7-bb8b-8a4c9633e8a8" containerID="3efb4bc17e0aa7545bc30b53177c613bbdcfa0a3f6cc1a97b94075aa74258870" exitCode=0
Nov 25 16:09:21 crc kubenswrapper[4704]: I1125 16:09:21.019294 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxkrt" event={"ID":"ebed1169-f829-44b7-bb8b-8a4c9633e8a8","Type":"ContainerDied","Data":"3efb4bc17e0aa7545bc30b53177c613bbdcfa0a3f6cc1a97b94075aa74258870"}
Nov 25 16:09:21 crc kubenswrapper[4704]: I1125 16:09:21.020900 4704 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 25 16:09:22 crc kubenswrapper[4704]: I1125 16:09:22.030425 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxkrt" event={"ID":"ebed1169-f829-44b7-bb8b-8a4c9633e8a8","Type":"ContainerStarted","Data":"b854465a89918a183133a4167f7d3d589432321f7075dacd7c32a862acbf350b"}
Nov 25 16:09:27 crc kubenswrapper[4704]: I1125 16:09:27.972830 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:27 crc kubenswrapper[4704]: I1125 16:09:27.973479 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:28 crc kubenswrapper[4704]: I1125 16:09:28.019469 4704 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:28 crc kubenswrapper[4704]: I1125 16:09:28.049189 4704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wxkrt" podStartSLOduration=8.632820312 podStartE2EDuration="11.049154398s" podCreationTimestamp="2025-11-25 16:09:17 +0000 UTC" firstStartedPulling="2025-11-25 16:09:18.999875275 +0000 UTC m=+2045.268149046" lastFinishedPulling="2025-11-25 16:09:21.416209351 +0000 UTC m=+2047.684483132" observedRunningTime="2025-11-25 16:09:22.055412012 +0000 UTC m=+2048.323685803" watchObservedRunningTime="2025-11-25 16:09:28.049154398 +0000 UTC m=+2054.317428179"
Nov 25 16:09:28 crc kubenswrapper[4704]: I1125 16:09:28.120885 4704 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:28 crc kubenswrapper[4704]: I1125 16:09:28.255184 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wxkrt"]
Nov 25 16:09:30 crc kubenswrapper[4704]: I1125 16:09:30.091422 4704 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wxkrt" podUID="ebed1169-f829-44b7-bb8b-8a4c9633e8a8" containerName="registry-server" containerID="cri-o://b854465a89918a183133a4167f7d3d589432321f7075dacd7c32a862acbf350b" gracePeriod=2
Nov 25 16:09:30 crc kubenswrapper[4704]: I1125 16:09:30.463239 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:30 crc kubenswrapper[4704]: I1125 16:09:30.580894 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx4fk\" (UniqueName: \"kubernetes.io/projected/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-kube-api-access-hx4fk\") pod \"ebed1169-f829-44b7-bb8b-8a4c9633e8a8\" (UID: \"ebed1169-f829-44b7-bb8b-8a4c9633e8a8\") "
Nov 25 16:09:30 crc kubenswrapper[4704]: I1125 16:09:30.581279 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-catalog-content\") pod \"ebed1169-f829-44b7-bb8b-8a4c9633e8a8\" (UID: \"ebed1169-f829-44b7-bb8b-8a4c9633e8a8\") "
Nov 25 16:09:30 crc kubenswrapper[4704]: I1125 16:09:30.581336 4704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-utilities\") pod \"ebed1169-f829-44b7-bb8b-8a4c9633e8a8\" (UID: \"ebed1169-f829-44b7-bb8b-8a4c9633e8a8\") "
Nov 25 16:09:30 crc kubenswrapper[4704]: I1125 16:09:30.582528 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-utilities" (OuterVolumeSpecName: "utilities") pod "ebed1169-f829-44b7-bb8b-8a4c9633e8a8" (UID: "ebed1169-f829-44b7-bb8b-8a4c9633e8a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:09:30 crc kubenswrapper[4704]: I1125 16:09:30.588568 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-kube-api-access-hx4fk" (OuterVolumeSpecName: "kube-api-access-hx4fk") pod "ebed1169-f829-44b7-bb8b-8a4c9633e8a8" (UID: "ebed1169-f829-44b7-bb8b-8a4c9633e8a8"). InnerVolumeSpecName "kube-api-access-hx4fk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 16:09:30 crc kubenswrapper[4704]: I1125 16:09:30.625585 4704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebed1169-f829-44b7-bb8b-8a4c9633e8a8" (UID: "ebed1169-f829-44b7-bb8b-8a4c9633e8a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 16:09:30 crc kubenswrapper[4704]: I1125 16:09:30.684072 4704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx4fk\" (UniqueName: \"kubernetes.io/projected/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-kube-api-access-hx4fk\") on node \"crc\" DevicePath \"\""
Nov 25 16:09:30 crc kubenswrapper[4704]: I1125 16:09:30.684119 4704 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 16:09:30 crc kubenswrapper[4704]: I1125 16:09:30.684137 4704 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebed1169-f829-44b7-bb8b-8a4c9633e8a8-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 16:09:31 crc kubenswrapper[4704]: I1125 16:09:31.098433 4704 generic.go:334] "Generic (PLEG): container finished" podID="ebed1169-f829-44b7-bb8b-8a4c9633e8a8" containerID="b854465a89918a183133a4167f7d3d589432321f7075dacd7c32a862acbf350b" exitCode=0
Nov 25 16:09:31 crc kubenswrapper[4704]: I1125 16:09:31.098475 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxkrt" event={"ID":"ebed1169-f829-44b7-bb8b-8a4c9633e8a8","Type":"ContainerDied","Data":"b854465a89918a183133a4167f7d3d589432321f7075dacd7c32a862acbf350b"}
Nov 25 16:09:31 crc kubenswrapper[4704]: I1125 16:09:31.098496 4704 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wxkrt"
Nov 25 16:09:31 crc kubenswrapper[4704]: I1125 16:09:31.098519 4704 scope.go:117] "RemoveContainer" containerID="b854465a89918a183133a4167f7d3d589432321f7075dacd7c32a862acbf350b"
Nov 25 16:09:31 crc kubenswrapper[4704]: I1125 16:09:31.098505 4704 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxkrt" event={"ID":"ebed1169-f829-44b7-bb8b-8a4c9633e8a8","Type":"ContainerDied","Data":"cd5ea19ae6a3929fc597fb31f5962ef403329c83164a95781d6e5172a044d384"}
Nov 25 16:09:31 crc kubenswrapper[4704]: I1125 16:09:31.116441 4704 scope.go:117] "RemoveContainer" containerID="3efb4bc17e0aa7545bc30b53177c613bbdcfa0a3f6cc1a97b94075aa74258870"
Nov 25 16:09:31 crc kubenswrapper[4704]: I1125 16:09:31.123511 4704 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wxkrt"]
Nov 25 16:09:31 crc kubenswrapper[4704]: I1125 16:09:31.128653 4704 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wxkrt"]
Nov 25 16:09:31 crc kubenswrapper[4704]: I1125 16:09:31.154519 4704 scope.go:117] "RemoveContainer" containerID="876ed98704db044e96e731b452e4783e0aec6cff16bee47d334b8959579baeba"
Nov 25 16:09:31 crc kubenswrapper[4704]: I1125 16:09:31.175971 4704 scope.go:117] "RemoveContainer" containerID="b854465a89918a183133a4167f7d3d589432321f7075dacd7c32a862acbf350b"
Nov 25 16:09:31 crc kubenswrapper[4704]: E1125 16:09:31.176403 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b854465a89918a183133a4167f7d3d589432321f7075dacd7c32a862acbf350b\": container with ID starting with b854465a89918a183133a4167f7d3d589432321f7075dacd7c32a862acbf350b not found: ID does not exist" containerID="b854465a89918a183133a4167f7d3d589432321f7075dacd7c32a862acbf350b"
Nov 25 16:09:31 crc kubenswrapper[4704]: I1125 16:09:31.176441 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b854465a89918a183133a4167f7d3d589432321f7075dacd7c32a862acbf350b"} err="failed to get container status \"b854465a89918a183133a4167f7d3d589432321f7075dacd7c32a862acbf350b\": rpc error: code = NotFound desc = could not find container \"b854465a89918a183133a4167f7d3d589432321f7075dacd7c32a862acbf350b\": container with ID starting with b854465a89918a183133a4167f7d3d589432321f7075dacd7c32a862acbf350b not found: ID does not exist"
Nov 25 16:09:31 crc kubenswrapper[4704]: I1125 16:09:31.176465 4704 scope.go:117] "RemoveContainer" containerID="3efb4bc17e0aa7545bc30b53177c613bbdcfa0a3f6cc1a97b94075aa74258870"
Nov 25 16:09:31 crc kubenswrapper[4704]: E1125 16:09:31.176752 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3efb4bc17e0aa7545bc30b53177c613bbdcfa0a3f6cc1a97b94075aa74258870\": container with ID starting with 3efb4bc17e0aa7545bc30b53177c613bbdcfa0a3f6cc1a97b94075aa74258870 not found: ID does not exist" containerID="3efb4bc17e0aa7545bc30b53177c613bbdcfa0a3f6cc1a97b94075aa74258870"
Nov 25 16:09:31 crc kubenswrapper[4704]: I1125 16:09:31.177016 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3efb4bc17e0aa7545bc30b53177c613bbdcfa0a3f6cc1a97b94075aa74258870"} err="failed to get container status \"3efb4bc17e0aa7545bc30b53177c613bbdcfa0a3f6cc1a97b94075aa74258870\": rpc error: code = NotFound desc = could not find container \"3efb4bc17e0aa7545bc30b53177c613bbdcfa0a3f6cc1a97b94075aa74258870\": container with ID starting with 3efb4bc17e0aa7545bc30b53177c613bbdcfa0a3f6cc1a97b94075aa74258870 not found: ID does not exist"
Nov 25 16:09:31 crc kubenswrapper[4704]: I1125 16:09:31.177034 4704 scope.go:117] "RemoveContainer" containerID="876ed98704db044e96e731b452e4783e0aec6cff16bee47d334b8959579baeba"
Nov 25 16:09:31 crc kubenswrapper[4704]: E1125 16:09:31.177278 4704 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"876ed98704db044e96e731b452e4783e0aec6cff16bee47d334b8959579baeba\": container with ID starting with 876ed98704db044e96e731b452e4783e0aec6cff16bee47d334b8959579baeba not found: ID does not exist" containerID="876ed98704db044e96e731b452e4783e0aec6cff16bee47d334b8959579baeba"
Nov 25 16:09:31 crc kubenswrapper[4704]: I1125 16:09:31.177298 4704 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"876ed98704db044e96e731b452e4783e0aec6cff16bee47d334b8959579baeba"} err="failed to get container status \"876ed98704db044e96e731b452e4783e0aec6cff16bee47d334b8959579baeba\": rpc error: code = NotFound desc = could not find container \"876ed98704db044e96e731b452e4783e0aec6cff16bee47d334b8959579baeba\": container with ID starting with 876ed98704db044e96e731b452e4783e0aec6cff16bee47d334b8959579baeba not found: ID does not exist"
Nov 25 16:09:32 crc kubenswrapper[4704]: I1125 16:09:32.424273 4704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebed1169-f829-44b7-bb8b-8a4c9633e8a8" path="/var/lib/kubelet/pods/ebed1169-f829-44b7-bb8b-8a4c9633e8a8/volumes"